Extracting Inverse Kinematics Parameters from Human Motion Data

Taku Kohmura, Atsushi Kuroda, Shunsuke Kudoh, Masaki Hiraga, Yoshihisa Shinagawa

Research output: Contribution to journal › Article › peer-review

Abstract

Even though increased computational power has enabled users to view heavy 3D polygonal animations on their desktops, there are still few effective methods for interactive online control of 3D models in cyberspace. In this paper, we propose a new inverse kinematics method by which users can control tree-structured 3D polygonal models with motion data. The key idea is to extract the parameters necessary for inverse kinematics control from the motion data: these include the mass matrix, which determines the motion of the redundant joints, and the transform function of the end effectors, which determines the relative velocity of the controlled segment and the end effectors. These parameters are then used to calculate the response of the 3D character model when users click and drag parts of the model with the mouse. Using our method, users can easily edit, warp, and retarget 3D character motions with minimal effort. Our method is especially effective for handling complex 3D human motion data captured by motion-capture devices.
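To illustrate the role of a mass (weight) matrix in distributing motion among redundant joints, here is a minimal sketch of weighted damped-least-squares inverse kinematics for a planar chain. This is a generic textbook formulation, not the paper's actual method; the link lengths, weights, and damping value are hypothetical choices for the example. Joints given a larger weight (a larger "mass") move less when the end effector is dragged toward a target:

```python
import numpy as np

def fk(q, lengths):
    """Forward kinematics of a planar serial chain: end-effector (x, y)."""
    angles = np.cumsum(q)  # absolute link angles from relative joint angles
    return np.array([np.sum(lengths * np.cos(angles)),
                     np.sum(lengths * np.sin(angles))])

def jacobian(q, lengths):
    """2xN Jacobian of the end-effector position w.r.t. joint angles."""
    angles = np.cumsum(q)
    J = np.zeros((2, len(q)))
    for i in range(len(q)):
        # joint i rotates all links from i onward
        J[0, i] = -np.sum(lengths[i:] * np.sin(angles[i:]))
        J[1, i] = np.sum(lengths[i:] * np.cos(angles[i:]))
    return J

def ik_step(q, target, lengths, weights, damping=0.1):
    """One weighted damped-least-squares step:
    dq = W^-1 J^T (J W^-1 J^T + lambda^2 I)^-1 dx.
    Heavier joints (large weight) receive smaller updates, which is the
    role the abstract assigns to the mass matrix for redundant joints."""
    J = jacobian(q, lengths)
    dx = target - fk(q, lengths)
    W_inv = np.diag(1.0 / weights)
    A = J @ W_inv @ J.T + damping**2 * np.eye(2)
    return q + W_inv @ J.T @ np.linalg.solve(A, dx)

# Hypothetical 3-link chain; the last joint is "heavy" and so moves least.
lengths = np.array([1.0, 1.0, 1.0])
weights = np.array([1.0, 1.0, 5.0])
target = np.array([1.5, 1.5])
q = np.array([0.3, 0.3, 0.3])
for _ in range(100):
    q = ik_step(q, target, lengths, weights)
# After iterating, fk(q, lengths) approaches the dragged target position.
```

In an interactive setting, `target` would be updated every frame from the mouse position while dragging, and one or a few `ik_step` calls per frame suffice to make the model follow the cursor.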
Translated title of the contribution: Extracting Inverse Kinematics Parameters from Human Motion Data
Original language: Japanese
Pages (from-to): 143-152
Number of pages: 10
Journal: The Journal of the Institute of Image Electronics Engineers of Japan
Volume: 32
Issue number: 2
Publication status: Published - Feb 2003

