Fast Neural Style Transfer for Motion Data

Taku Komura, Daniel Holden, Ikhsanul Habibie, Ikuo Kusajima

Research output: Contribution to journal › Article › peer-review

Abstract / Description of output

We present a fast, efficient technique for performing neural style transfer of human motion data using a feedforward neural network. Typically, feedforward neural networks are trained in a supervised fashion, specifying both the input and the desired output simultaneously. For tasks such as style transfer, this data may not always be available, so a different training method is required. We present a method of training a feedforward neural network using a loss network, in this case a convolutional autoencoder trained on a large motion database. This loss network is used to evaluate a number of separate error terms for training the feedforward neural network. We compute a loss function in the space of the hidden units of the loss network that is based on style difference and motion-specific constraints such as foot sliding, joint lengths, and the trajectory of the character. By back-propagating these errors into the feedforward network, we can train it to perform a transformation equivalent to neural style transfer. Using our framework, we can transform the style of motion thousands of times faster than previous approaches that use optimization. We demonstrate our system by transforming locomotion into various styles.
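As a rough illustration of the training setup described in the abstract, the following is a minimal sketch assuming PyTorch. The architectures, layer sizes, and the Gram-matrix style term are illustrative assumptions, not the paper's exact networks or loss terms; the paper's loss also includes motion-specific constraints such as foot sliding, joint lengths, and the character trajectory, which are only marked as a comment here.

# Illustrative sketch of loss-network training for motion style transfer (assumed PyTorch).
# All architectures, sizes, and weights below are placeholders, not the paper's exact setup.
import torch
import torch.nn as nn

class LossNetwork(nn.Module):
    """Convolutional autoencoder encoder trained on a large motion database (pretrained, frozen)."""
    def __init__(self, channels=73, hidden=256):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv1d(channels, hidden, kernel_size=25, padding=12),
            nn.ReLU(),
        )

    def forward(self, motion):               # motion: (batch, channels, frames)
        return self.encoder(motion)          # hidden units in which the losses are evaluated

class TransferNetwork(nn.Module):
    """Feedforward network mapping input motion to stylized motion in a single pass."""
    def __init__(self, channels=73, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(channels, hidden, kernel_size=15, padding=7),
            nn.ReLU(),
            nn.Conv1d(hidden, channels, kernel_size=15, padding=7),
        )

    def forward(self, motion):
        return self.net(motion)

def gram(features):
    """Gram matrix over time, a common proxy for style in hidden-unit space."""
    b, c, t = features.shape
    return torch.bmm(features, features.transpose(1, 2)) / (c * t)

loss_net = LossNetwork().eval()               # pretrained autoencoder weights would be loaded here
for p in loss_net.parameters():
    p.requires_grad_(False)

transfer_net = TransferNetwork()
optimizer = torch.optim.Adam(transfer_net.parameters(), lr=1e-4)

content_motion = torch.randn(8, 73, 240)      # placeholder batch of input motion clips
style_motion = torch.randn(1, 73, 240)        # placeholder clip in the target style

for step in range(1000):
    output = transfer_net(content_motion)

    h_out = loss_net(output)
    h_content = loss_net(content_motion)
    h_style = loss_net(style_motion)

    content_loss = ((h_out - h_content) ** 2).mean()
    style_loss = ((gram(h_out) - gram(h_style).expand_as(gram(h_out))) ** 2).mean()
    # Motion-specific terms (foot sliding, joint lengths, trajectory) would be added here.
    loss = content_loss + 0.1 * style_loss

    optimizer.zero_grad()
    loss.backward()                           # errors back-propagate through the frozen
    optimizer.step()                          # loss network into the feedforward transfer network

Once trained, only transfer_net is needed at run time, which is why stylizing a new clip is a single feedforward pass rather than an iterative optimization.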
Original language: English
Pages (from-to): 42-49
Number of pages: 8
Journal: IEEE Computer Graphics and Applications
Volume: 37
Issue number: 4
DOIs
Publication status: Published - 21 Aug 2017

Keywords / Materials (for Non-textual outputs)

  • motion capture
  • deep learning
  • style transfer
  • machine learning
