Sonification of Gestures Using Specknets

Vangelis Lympouridis, Martin Parker, Alexander Young, D K Arvind

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract / Description of output

This paper introduces a novel approach to gesture recognition for interactive virtual instruments. The method is based on tracking body postures and movement using a wireless network of Orient-2 specks strapped to parts of the body, in contrast to camera-based methods, which require a degree of infrastructure support. The paper describes the rationale underlying the method of sonification from gestures, addressing issues such as disembodiment and virtuality. A working system is described, together with the method for interpreting the gestures as sounds in the MaxMSP tool.
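The abstract describes a pipeline in which orientation data from body-worn Orient-2 specks is interpreted as control input for sound synthesis in MaxMSP. As a purely illustrative sketch, and not the mapping used in the paper, the following Python fragment shows one way a single orientation sample (roll, pitch, yaw) could be rescaled into synthesis parameters; all ranges, parameter names, and mapping choices here are assumptions for illustration.

```python
# Hypothetical sketch (not from the paper): mapping an Orient-2-style
# orientation reading (roll, pitch, yaw in degrees) to simple synthesis
# parameters, analogous to values a MaxMSP patch might receive and scale.

def scale(value, in_lo, in_hi, out_lo, out_hi):
    """Linearly rescale value from [in_lo, in_hi] to [out_lo, out_hi], clamped."""
    value = max(in_lo, min(in_hi, value))
    return out_lo + (value - in_lo) * (out_hi - out_lo) / (in_hi - in_lo)

def gesture_to_sound(roll, pitch, yaw):
    """Map one orientation sample to (frequency in Hz, amplitude 0-1, pan -1..1)."""
    freq = scale(pitch, -90.0, 90.0, 110.0, 880.0)   # assumed: arm elevation -> pitch
    amp = scale(abs(roll), 0.0, 180.0, 0.0, 1.0)     # assumed: wrist rotation -> loudness
    pan = scale(yaw, -180.0, 180.0, -1.0, 1.0)       # assumed: facing direction -> stereo pan
    return freq, amp, pan

if __name__ == "__main__":
    print(gesture_to_sound(roll=45.0, pitch=30.0, yaw=-90.0))
```

In the system described by the paper, this kind of scaling and interpretation is carried out within MaxMSP rather than in standalone code; the sketch only illustrates the general idea of turning streamed posture data into sound parameters.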
Original language: English
Title of host publication: Proceedings SMC'07, 4th Sound and Music Computing Conference
Pages: 382-385
Number of pages: 4
Publication status: Published - 2007
