Planning for Social Interaction in a Robot Bartender Domain

Ronald P. A. Petrick, Mary Ellen Foster

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

A robot coexisting with humans must not only perform physical tasks, but also interact with humans in a socially appropriate manner. In many social settings, this involves the use of social signals like gaze, facial expression, and language. In this paper, we describe an application of planning to task-based social interaction, using a robot that must interact with multiple human agents in a simple bartending domain. We show how social states are inferred from low-level sensors, using vision and speech as input modalities, and how we use the knowledge-level PKS planner to construct plans with task, dialogue, and social actions, as an alternative to current mainstream methods of interaction management. The resulting system has been evaluated in a real-world study with human subjects.
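The pipeline the abstract describes (low-level vision/speech observations fused into a social state, from which the planner chooses task, dialogue, or social actions) can be illustrated with a minimal sketch. All names and rules below are hypothetical simplifications for illustration; the actual system uses the knowledge-level PKS planner, not hand-written rules.

```python
from dataclasses import dataclass
from typing import Optional

# Illustrative sketch only: a real knowledge-level planner would search over
# actions with preconditions and effects on the robot's knowledge state,
# rather than applying the fixed rules shown here.

@dataclass
class Observation:
    facing_bar: bool        # from vision: is the customer facing the bartender?
    close_to_bar: bool      # from vision: is the customer near the bar?
    speech: Optional[str]   # from speech recognition, if anything was said

def infer_social_state(obs: Observation) -> str:
    """Fuse low-level sensor data into a social state, e.g. whether the
    customer is seeking the bartender's attention."""
    if obs.facing_bar and obs.close_to_bar:
        return "seeking_attention"
    return "not_seeking_attention"

def select_action(state: str, obs: Observation) -> str:
    """Choose the next task, dialogue, or social action for the robot."""
    if state == "seeking_attention":
        if obs.speech:              # the customer has placed an order
            return "serve_drink"    # task action
        return "greet_customer"     # social/dialogue action: acknowledge them
    return "wait"

obs = Observation(facing_bar=True, close_to_bar=True, speech=None)
print(select_action(infer_social_state(obs), obs))  # -> greet_customer
```

A planner-based approach replaces `select_action` with plan construction over knowledge-level preconditions and effects, which lets the same machinery handle multiple customers and interleave social and task actions.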
Original language: English
Title of host publication: Proceedings of the 23rd International Conference on Automated Planning and Scheduling (ICAPS 2013), Special Track on Novel Applications
Publisher: AAAI Press
Pages: 389-397
Number of pages: 9
ISBN (Print): 978-1-57735-609-7
Publication status: Published - 1 Jun 2013
