We propose a novel methodology for learning and synthesising whole classes of high-dimensional movements from a limited set of demonstrated examples that satisfy some underlying 'latent' low-dimensional task constraints. We employ non-linear dimensionality reduction to extract a canonical latent space that captures some of the essential topology of the unobserved task space. In this latent space, we identify a suitable parametrisation of movements with control policies, such that they are easily modulated to generate novel movements from the same class and are robust to perturbations. We evaluate our method in controlled simulation experiments with simple robots (reaching and periodic movement tasks) as well as on a data set of very high-dimensional human (punching) movements. We verify that, from only a few examples, we can generate a continuum of new movements within the demonstrated class in both the robotic and the human data.
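The first step of the pipeline described above, embedding high-dimensional movement data into a low-dimensional latent space with non-linear dimensionality reduction, can be sketched as follows. This is an illustrative example only, not the paper's exact method: Isomap stands in for whatever non-linear technique the authors use, and the synthetic "demonstration" data and all variable names are our own assumptions.

```python
# Sketch (assumption, not the paper's implementation): embed a set of
# high-dimensional demonstrated movements into a low-dimensional latent
# space using a non-linear dimensionality-reduction technique (Isomap
# here, as a generic stand-in).
import numpy as np
from sklearn.manifold import Isomap

rng = np.random.default_rng(0)

# Synthetic "demonstrations": 1000 poses in a 30-dimensional joint
# space, all lying on a 2D latent manifold (a non-linear image of a
# 2D latent variable), mimicking movements constrained by a
# low-dimensional task.
n_samples, n_dims, n_latent = 1000, 30, 2
latent = rng.normal(size=(n_samples, n_latent))
mixing = rng.normal(size=(n_latent, n_dims))
observed = np.tanh(latent @ mixing)  # non-linear embedding into joint space

# Recover a 2D latent space capturing the task topology.
embedding = Isomap(n_neighbors=10, n_components=n_latent)
latent_recovered = embedding.fit_transform(observed)

print(latent_recovered.shape)  # (1000, 2)
```

New movements can then be generated by interpolating or modulating trajectories in this recovered latent space and mapping them back to the full joint space, which is the role the control-policy parametrisation plays in the paper.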
Title of host publication: From Animals to Animats 10
Subtitle of host publication: 10th International Conference on Simulation of Adaptive Behavior, SAB 2008, Osaka, Japan, July 7-12, 2008. Proceedings
Editors: Minoru Asada, John C.T. Hallam, Jean-Arcady Meyer, Jun Tani
Number of pages: 11
Publication status: Published - 2008
Series name: Lecture Notes in Computer Science