Emulating human perception of motion similarity

Jeff K. T. Tang, Howard Leung, Taku Komura, Hubert P. H. Shum

Research output: Contribution to journal › Article › peer-review

Abstract / Description of output

Evaluating the similarity of motions is useful for motion retrieval, motion blending, and performance analysis of dancers and athletes. The Euclidean distance between corresponding joints has been widely adopted for measuring the similarity of postures and hence of motions. However, such a measure does not necessarily conform to the human perception of motion similarity. In this paper, we propose a new similarity measure based on machine learning techniques. We make use of the results of questionnaires from subjects answering whether arbitrary pairs of motions appear similar or not. Using the relative distances between the joints as the basic features, we train the system to compute the similarity of an arbitrary pair of motions. Experimental results show that our method outperforms methods based on the Euclidean distance between corresponding joints. Our method is applicable to content-based retrieval of human motion in large-scale database systems. It is also applicable to e-Learning systems which automatically evaluate the performance of dancers and athletes by comparing the subjects' motions with those of experts. Copyright © 2008 John Wiley & Sons, Ltd.
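
The abstract describes the approach only at a high level: represent postures by relative distances between joints, and train a classifier on questionnaire labels to decide whether a pair of motions is perceived as similar. The sketch below illustrates that idea in Python; the feature aggregation, the choice of learner (an SVM), and all function names are assumptions for illustration, not the authors' implementation.

```python
# Minimal sketch of the idea in the abstract: relative joint distances as posture
# features, plus a classifier trained on human "similar / not similar" labels,
# instead of raw Euclidean distance between corresponding joints.
import numpy as np
from sklearn.svm import SVC  # assumed learner; the paper's classifier may differ


def relative_distance_features(frame):
    """Pairwise distances between all joints of one posture.

    frame: (num_joints, 3) array of joint positions.
    Returns a flat vector of length num_joints * (num_joints - 1) / 2.
    """
    diffs = frame[:, None, :] - frame[None, :, :]
    dists = np.linalg.norm(diffs, axis=-1)
    upper = np.triu_indices(len(frame), k=1)
    return dists[upper]


def pair_feature(motion_a, motion_b):
    """Feature vector for a pair of aligned, equal-length motions.

    motion_a, motion_b: (num_frames, num_joints, 3) arrays.
    Aggregates per-frame feature differences by their mean absolute value;
    this aggregation scheme is an assumption for illustration.
    """
    feats_a = np.stack([relative_distance_features(f) for f in motion_a])
    feats_b = np.stack([relative_distance_features(f) for f in motion_b])
    return np.abs(feats_a - feats_b).mean(axis=0)


def train_similarity_model(motion_pairs, labels):
    """Fit a binary classifier from questionnaire labels (1 = perceived similar)."""
    X = np.stack([pair_feature(a, b) for a, b in motion_pairs])
    y = np.asarray(labels)
    model = SVC(probability=True)  # probability output can serve as a similarity score
    model.fit(X, y)
    return model
```

Given such a trained model, retrieval or e-Learning applications could rank candidate motions by the predicted probability of being perceived as similar to a query motion, rather than by joint-wise Euclidean distance.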
Original language: English
Pages (from-to): 211-221
Number of pages: 11
Journal: Computer Animation and Virtual Worlds
Volume: 19
Issue number: 3-4
DOIs
Publication status: Published - Aug 2008

Keywords / Materials (for Non-textual outputs)

  • 3D human motion similarity
  • human perception
  • pattern recognition
