Head Motion Analysis and Synthesis over Different Tasks

Atef Ben Youssef, Hiroshi Shimodaira, David A. Braude

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

It is known that subjects vary in their head movements. This paper presents an analysis of this variation across different tasks and speakers, and of its impact on head motion synthesis. Head and articulatory movements were measured with an ElectroMagnetic Articulograph (EMA) recorded synchronously with audio. A data set of speech from 12 people recorded on different tasks confirms that head motion varies over tasks and speakers. Experimental results confirmed that the proposed models were capable of learning and synthesising task-dependent head motions from speech. Subjective evaluation of the synthesised head motion shows that models trained on the matched task outperform those trained on a mismatched task, and that free-speech data yields models whose predicted motion participants prefer over models trained on read-speech data.
Original language: English
Title of host publication: Intelligent Virtual Agents
Subtitle of host publication: 13th International Conference, IVA 2013, Edinburgh, UK, August 29-31, 2013. Proceedings
Editors: Ruth Aylett, Brigitte Krenn, Catherine Pelachaud, Hiroshi Shimodaira
Publisher: Springer-Verlag GmbH
Pages: 285-294
Number of pages: 10
ISBN (Electronic): 978-3-642-40415-3
ISBN (Print): 978-3-642-40414-6
Publication status: Published - Sep 2013

Publication series

Name: Lecture Notes in Computer Science
Publisher: Springer Berlin / Heidelberg
Volume: 8108
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Keywords

  • head motion variety
  • head motion synthesis
