An HPSG Approach to Synchronous Speech and Deixis

Katya Alahverdzhieva, Alex Lascarides

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract / Description of output

The use of hand gestures to point at objects and individuals, or to navigate through landmarks on a virtually created map, is ubiquitous in face-to-face conversation. We take this observation as a starting point and demonstrate that deictic gestures can be analysed on a par with speech using standard methods from constraint-based grammars such as HPSG. In particular, we use the form of the deictic signal, the form of the speech signal (including its prosodic marking) and their relative temporal performance to derive an integrated multimodal tree that maps to an integrated multimodal meaning. The integration process is constrained via construction rules that rule out ill-formed input. These rules are derived from an empirical corpus study which sheds light on the interaction between speech and deictic gesture.
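The abstract describes construction rules that license an integrated multimodal node only for well-formed pairings of a deictic gesture with speech, constrained by form, prosodic marking, and relative timing. The following is a hypothetical sketch of that idea in Python, not the paper's actual formalism (HPSG constraints are typed feature structures, not procedural code); the `Signal`, `overlaps`, and `integrate` names are illustrative assumptions.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class Signal:
    """A speech word or a deictic gesture with its time span (seconds)."""
    form: str
    start: float
    end: float

def overlaps(a: Signal, b: Signal) -> bool:
    """True if the two signals' time spans intersect."""
    return a.start < b.end and b.start < a.end

def integrate(speech: Signal, gesture: Signal,
              prosodically_marked: bool) -> Optional[Tuple[str, str, str]]:
    """Toy construction rule: license a multimodal node only when the
    deictic gesture temporally overlaps a prosodically marked word.
    Ill-formed input is ruled out by returning None."""
    if prosodically_marked and overlaps(speech, gesture):
        return ("multimodal", speech.form, gesture.form)
    return None

word = Signal("this", 0.4, 0.7)
point = Signal("point-at", 0.3, 0.9)
print(integrate(word, point, prosodically_marked=True))
# → ('multimodal', 'this', 'point-at')
```

A non-overlapping gesture, or one co-occurring with an unmarked word, is rejected (`None`), mirroring how the paper's rules exclude ill-formed speech-gesture combinations.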
Original language: English
Title of host publication: Proceedings of the 18th International Conference on Head-Driven Phrase Structure Grammar (HPSG)
Editors: S. Müller
Place of Publication: Seattle
Publisher: CSLI Publications
Pages: 6-24
Number of pages: 19
Publication status: Published - 2011
