Abstract
A robot coexisting with humans must not only perform physical tasks but also interact with humans in a socially appropriate manner. We describe an application of planning to task-based social interaction, using a robot that must interact with multiple human agents in a simple bartending domain. The resulting system infers social states from low-level sensors, using vision and speech as input modalities, and uses the knowledge-level PKS planner to construct plans with task, dialogue, and social actions.
Original language | English
---|---
Title of host publication | Proceedings of the ICAPS 2013 Application Showcase
Pages | 10-13
Number of pages | 4
Publication status | Published - 1 Jun 2013