Supporting an online investigation of user interaction with an XAIP Agent

Alan Lindsay, Bart Craenen, Sara Dalzel-Job, Robin L. Hill, Ronald P. A. Petrick

Research output: Contribution to conference › Paper

Abstract

Human interaction relies on a wide range of signals, including non-verbal cues. In order to develop effective Explainable Planning (XAIP) agents, it is important that we understand the range and utility of these communication channels. Our intention is to develop an interactive agent whose behaviour is conditioned on the affective measures of the user (i.e., explicitly incorporating the user’s affective state within the planning model). Accurate prediction of user affective state relies on real-time analysis of various predictors, which can require specialist equipment and calibration. However, the worldwide COVID-19 lockdown has meant that many intended lab-based experiments have now been moved online, making such real-time analysis impractical. As a result, we have developed a website to support a data gathering experiment, including a video stream (for facial expression analysis) alongside access to mouse positions and task performance, providing rich observations of users as they interact with the agent and its plan. Underlying this system is the agent’s behaviour strategy, which must be computed in advance and captured efficiently. This paper describes the system we built and the challenges we faced in preparing it for deployment.
Original language: English
Publication status: Published - 22 Oct 2020
Event: 30th International Conference on Automated Planning and Scheduling - Nancy, France
Duration: 26 Oct 2020 – 30 Oct 2020


Conference: 30th International Conference on Automated Planning and Scheduling
Abbreviated title: ICAPS 2020


