Abstract
Existing kinematic research on orchestral conducting movement contributes to an understanding of conductors' beat-tracking and delivery of performance dynamics. Methodologically, such movement cues have been treated as distinct, isolated events. Yet, as practicing musicians and music pedagogues know, conductors' expressive instructions are highly flexible and dependent on the musical context. We seek to demonstrate an approach for finding effective descriptors of musical features in conducting movement within a valid musical context, and for extracting complex expressive semantics from elementary kinematic variations. This study therefore proposes a multi-task learning model to jointly identify dynamic, articulation, and phrasing cues from conducting kinematics. A professional conducting movement dataset is compiled using a high-resolution motion capture system. The ReliefF algorithm is applied to select significant features from the conducting movement, and a recurrent neural network (RNN) is implemented to identify the multiple movement cues. The experimental results disclose key elements of conducting movement that communicate musical expressiveness; they also highlight the advantage of multi-task learning over single-task learning in the complete musical context. To the best of our knowledge, this is the first attempt to use an RNN to explore multiple semantic expressive cues in conducting movement kinematics.
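The abstract names the two modelling steps (ReliefF feature selection followed by a multi-task RNN) but not the concrete architecture. Below is a minimal sketch of the multi-task idea only, not the authors' implementation: a shared recurrent encoder over kinematic feature sequences with one classification head per expressive cue; all layer sizes, class counts, and the unweighted loss sum are assumptions.

```python
# Sketch of a multi-task RNN for conducting-kinematics cues (PyTorch).
# NOT the paper's model: sizes, class counts, and loss weighting are assumed.
import torch
import torch.nn as nn

class MultiTaskConductingRNN(nn.Module):
    def __init__(self, n_features, hidden_size=64,
                 n_dynamic=3, n_articulation=2, n_phrasing=2):
        super().__init__()
        # Shared recurrent encoder over per-frame kinematic features
        # (e.g., ReliefF-selected motion-capture descriptors).
        self.encoder = nn.LSTM(n_features, hidden_size, batch_first=True)
        # One classification head per expressive cue (task).
        self.dynamic_head = nn.Linear(hidden_size, n_dynamic)
        self.articulation_head = nn.Linear(hidden_size, n_articulation)
        self.phrasing_head = nn.Linear(hidden_size, n_phrasing)

    def forward(self, x):
        # x: (batch, time, n_features) sequences of kinematic features
        _, (h, _) = self.encoder(x)
        h = h[-1]  # final hidden state summarizes the gesture
        return (self.dynamic_head(h),
                self.articulation_head(h),
                self.phrasing_head(h))

# Joint training: summing the per-task cross-entropy losses lets the
# shared encoder learn representations useful for all three cues at once,
# which is the core of the multi-task setup the abstract describes.
model = MultiTaskConductingRNN(n_features=20)
criterion = nn.CrossEntropyLoss()
x = torch.randn(8, 120, 20)            # 8 gestures, 120 frames, 20 features
y_dyn = torch.randint(0, 3, (8,))      # hypothetical dynamic labels
y_art = torch.randint(0, 2, (8,))      # hypothetical articulation labels
y_phr = torch.randint(0, 2, (8,))      # hypothetical phrasing labels
out_dyn, out_art, out_phr = model(x)
loss = (criterion(out_dyn, y_dyn)
        + criterion(out_art, y_art)
        + criterion(out_phr, y_phr))
loss.backward()
```

The single-task baselines the abstract compares against would correspond to training this same encoder with only one of the three heads attached.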
Original language | English |
---|---|
Publication status | Published - 4 Nov 2019 |
Event | 20th International Society for Music Information Retrieval: ISMIR2019 - Delft University of Technology, Delft, Netherlands. Duration: 4 Nov 2019 → 8 Nov 2019. https://ismir2019.ewi.tudelft.nl/ |
Conference
Conference | 20th International Society for Music Information Retrieval |
---|---|
Abbreviated title | ISMIR2019 |
Country/Territory | Netherlands |
City | Delft |
Period | 4/11/19 → 8/11/19 |
Internet address | https://ismir2019.ewi.tudelft.nl/ |