Knowledge-Level Planning for Task-Based Social Interaction

Ronald P. A. Petrick, Mary Ellen Foster

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract / Description of output

A robot coexisting with humans must not only be able to perform physical tasks, but must also be able to interact with humans in a socially appropriate manner. In many social settings, this involves the use of social signals like gaze, facial expression, and language. In this paper, we discuss the problem of planning social and task-based actions for a robot that must interact with multiple human agents in a dynamic domain. We show how social states are inferred from low-level sensors, using vision and speech as input modalities, and use a general-purpose knowledge-level planner to model task, dialogue, and social actions, as an alternative to current mainstream methods of interaction management. The resulting system has been evaluated in a real-world study with human subjects in a simple bartending scenario.
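To illustrate the kind of pipeline the abstract describes, the following is a minimal sketch (in Python, with invented names; it is not the authors' code and does not reproduce their planner) of how low-level vision and speech estimates might be mapped to symbolic, knowledge-level facts that a planner could then use to select task, dialogue, and social actions in a bartending scenario. The predicates (seeksAttention, ordered) and the hand-written rule are illustrative assumptions only.

```python
# Hypothetical sketch: sensor estimates -> knowledge-level facts for a planner.
# Names and rules are assumptions for illustration, not the paper's system.

from dataclasses import dataclass
from typing import List


@dataclass
class Observation:
    """Low-level sensor estimates for one tracked person."""
    agent_id: str
    facing_bar: bool      # from vision: head/torso orientation
    close_to_bar: bool    # from vision: position relative to the bar
    spoke_to_robot: bool  # from speech recognition


def infer_social_state(obs: Observation) -> List[str]:
    """Map sensor estimates to symbolic facts for the planner's knowledge base."""
    facts = []
    # Simplified rule: a customer close to the bar and facing it is treated
    # as seeking the bartender's attention.
    if obs.close_to_bar and obs.facing_bar:
        facts.append(f"seeksAttention({obs.agent_id})")
    if obs.spoke_to_robot:
        facts.append(f"ordered({obs.agent_id})")
    return facts


if __name__ == "__main__":
    customer = Observation("a1", facing_bar=True, close_to_bar=True,
                           spoke_to_robot=False)
    print(infer_social_state(customer))  # ['seeksAttention(a1)']
```

In the approach described in the abstract, facts of this kind would form part of the planner's knowledge-level state, from which it plans both physical actions (serving a drink) and social/dialogue actions (greeting or acknowledging a customer).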
Original language: English
Title of host publication: Workshop of the UK Planning and Scheduling Special Interest Group (PlanSIG 2012)
Number of pages: 8
Publication status: Published - 1 Dec 2012

