Abstract
Even though increased computational power has enabled users to view complex 3D polygonal animations on their desktops, there are still few effective methods for interactive online control of 3D models in cyberspace. In this paper, we propose a new inverse kinematics method by which users can control tree-structured 3D polygonal models with motion data. The key idea is to extract the parameters required for inverse kinematics control from the motion data: these include the mass matrix, which determines the motion of the redundant joints, and the transform functions of the end effectors, which determine the relative velocities of the controlled segment and the end effectors. These parameters are then used to compute the response of the 3D character model when the user clicks and drags parts of the model with the mouse. Using our method, users can edit, warp, and retarget 3D character motions with minimal effort. Our method is especially effective for handling complex 3D human motion data captured by motion-capture devices.
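The abstract does not give the exact formulation, but the role it assigns to the extracted mass matrix, weighting how redundant joints share a dragged end-effector velocity, matches a weighted (damped) least-squares pseudo-inverse at the velocity level. The sketch below is a generic illustration of that idea, not the paper's method; the function name `weighted_ik_step`, the weight matrix `W`, and the toy Jacobian are all assumptions introduced for illustration.

```python
import numpy as np

def weighted_ik_step(J, dx, W, damping=1e-3):
    """One velocity-level IK step with a joint-weighting matrix.

    Hypothetical sketch: W plays the role the abstract assigns to the
    mass matrix extracted from motion data, biasing how redundant
    joints absorb the motion of a dragged end effector.

    J  : (m, n) end-effector Jacobian
    dx : (m,)   desired end-effector velocity (e.g. from a mouse drag)
    W  : (n, n) symmetric positive-definite joint weighting matrix
    """
    W_inv = np.linalg.inv(W)
    # Weighted, damped pseudo-inverse:
    #   dq = W^-1 J^T (J W^-1 J^T + lambda*I)^-1 dx
    A = J @ W_inv @ J.T + damping * np.eye(J.shape[0])
    return W_inv @ J.T @ np.linalg.solve(A, dx)

# Toy usage: 3-joint planar chain, drag its tip 5 cm along +x.
J = np.array([[-0.2, -0.1, 0.0],
              [ 0.9,  0.5, 0.3]])   # 2x3 Jacobian (made-up values)
W = np.diag([3.0, 2.0, 1.0])        # heavier joints move less
dx = np.array([0.05, 0.0])
print(weighted_ik_step(J, dx, W))
```

With uniform weights this reduces to ordinary damped least-squares IK; non-uniform weights are what let data-derived per-joint quantities shape how the redundancy is resolved.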
| Translated title of the contribution | Extracting Inverse Kinematics Parameters from Human Motion Data |
| --- | --- |
| Original language | Japanese |
| Pages (from-to) | 143-152 |
| Number of pages | 10 |
| Journal | The Journal of the Institute of Image Electronics Engineers of Japan |
| Volume | 32 |
| Issue number | 2 |
| Publication status | Published - Feb 2003 |