Research in the field of embodied music cognition has shown the importance of coupled processes of body activity (action) and multimodal representations of these actions (perception) in how music is processed. Technologies from the field of human–computer interaction (HCI) provide excellent means to intervene in, and extend, these coupled action–perception processes. In this article, this model is applied to a concrete HCI application called the "Conducting Master." The application enables multiple users to interact with the system in real time in order to explore and learn how musical meter can be articulated through body movements (i.e., meter-mimicking gestures). Techniques are provided to model and automatically recognize these gestures so that multimodal feedback streams can be returned to the users. These techniques rely on template-based methods that approach meter-mimicking gestures explicitly from a spatiotemporal perspective. To conclude, some concrete setups are presented in which the functionality of the Conducting Master was evaluated.
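The abstract mentions template-based recognition of spatiotemporal gesture patterns but does not detail the method. A minimal sketch of one common template-based approach is dynamic time warping (DTW) over 2-D point trajectories; the gesture labels, template shapes, and feature choice below are illustrative assumptions, not the paper's actual implementation.

```python
from math import hypot

def dtw_distance(a, b):
    """Dynamic time warping distance between two 2-D point sequences.

    Points are (x, y) tuples; the local cost is Euclidean distance.
    DTW tolerates tempo differences by allowing non-linear time alignment,
    which suits meter-mimicking gestures performed at varying speeds.
    """
    n, m = len(a), len(b)
    INF = float("inf")
    # dp[i][j] = cost of the best alignment of a[:i] with b[:j]
    dp = [[INF] * (m + 1) for _ in range(n + 1)]
    dp[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = hypot(a[i - 1][0] - b[j - 1][0],
                         a[i - 1][1] - b[j - 1][1])
            dp[i][j] = cost + min(dp[i - 1][j],      # skip a point in b
                                  dp[i][j - 1],      # skip a point in a
                                  dp[i - 1][j - 1])  # match both points
    return dp[n][m]

def classify(gesture, templates):
    """Return the label of the nearest stored template gesture."""
    return min(templates, key=lambda label: dtw_distance(gesture, templates[label]))

# Hypothetical beat-pattern templates (down-up vs. down-right-up strokes):
templates = {
    "duple":  [(0.0, 1.0), (0.0, 0.0), (0.0, 1.0)],
    "triple": [(0.0, 1.0), (0.0, 0.0), (1.0, 0.0), (0.0, 1.0)],
}
```

A noisy performance of the triple pattern, e.g. `[(0.0, 0.9), (0.05, 0.0), (0.95, 0.05), (0.0, 1.0)]`, would then be matched to the `"triple"` template, since its warped distance to that template is far smaller than to the duple one.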
Number of pages: 17
Journal: International Journal of Human-Computer Interaction
Publication status: Published - 3 Jul 2013