A Physically Based Motion Retargeting Filter
The emergence of motion capture as a commonplace technique for generating realistic animation of human-like characters has heightened interest in methods for modifying or retargeting a captured motion to different characters.
In this thesis, we present a novel constraint-based motion editing technique. On the basis of animator-specified kinematic and dynamic constraints, the method converts a given captured or animated motion to a physically plausible motion. In contrast to previous methods using spacetime optimization, we cast the motion editing problem as a constrained state estimation problem based on the per-frame Kalman filter framework. The method works as a filter that sequentially scans the input motion to produce a stream of output motion frames at a stable interactive rate.
Our motion editing algorithm consists of two consecutive filters: the unscented Kalman filter estimates an optimal pose for the current frame that conforms to the given constraints, and feeds the result to the least-squares filter. The least-squares filter then resolves the inter-frame inconsistency introduced by the Kalman filter's independent handling of position, velocity, and acceleration.
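The two-stage structure can be illustrated with a toy sketch. This is not the thesis's actual method: the real first stage is an unscented Kalman filter over a full articulated pose, whereas here a simple per-frame clamp stands in for the constrained estimate, and the second stage is a generic least-squares smoother that trades fidelity to the per-frame result against acceleration smoothness. The function names, the 1-DOF signal, and the `smooth` weight are all illustrative assumptions.

```python
import numpy as np

def stage1_constrain(frames, limit):
    """Stand-in for the per-frame UKF stage: independently force each
    frame to satisfy a (toy) kinematic constraint |x| <= limit."""
    return np.clip(frames, -limit, limit)

def stage2_least_squares(frames, smooth=1.0):
    """Stand-in for the least-squares stage: resolve the inter-frame
    inconsistency left by independent per-frame processing.
    Minimizes ||x - frames||^2 + smooth * ||D x||^2, where D is the
    second-difference (acceleration) operator, via one linear solve."""
    n = len(frames)
    D = np.zeros((n - 2, n))
    for i in range(n - 2):
        D[i, i:i + 3] = [1.0, -2.0, 1.0]  # discrete acceleration stencil
    A = np.eye(n) + smooth * (D.T @ D)    # normal equations of the trade-off
    return np.linalg.solve(A, frames)

# Pipeline: per-frame constraint enforcement, then inter-frame smoothing.
raw = np.array([0.0, 2.0, -1.0, 3.0, 0.5, 2.5, -0.5, 1.5])
constrained = stage1_constrain(raw, limit=1.0)
smoothed = stage2_least_squares(constrained, smooth=2.0)
```

Because the smoothing objective is minimized exactly, the output's summed squared acceleration is never larger than that of the stage-1 signal; the actual filters in the thesis play the analogous roles for full-body poses under kinematic and dynamic constraints.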
Animators can tune several filter parameters to adapt the filter to different motions, or can turn individual constraints on or off based on their contributions to the final result. One particularly appealing feature of the proposed technique is that animators find it scalable and intuitive. Experiments on various examples show that the technique processes the motion of a human model with 54 degrees of freedom at about 150 fps when only kinematic constraints are applied, and at about 10 fps when both kinematic and dynamic constraints are applied.
Supplementary movie: one long animation with voiceover.