
Example-based Approaches for Expressive Locomotion Generation

Posted on: 2014-01-08
Degree: Ph.D
Type: Thesis
University: University of California, Davis
Candidate: Kim, Yejin
Full Text: PDF
GTID: 2458390008952632
Subject: Computer Science
Abstract/Summary:
Our goal is to construct a comprehensive motion framework, particularly for human locomotion, that adopts motion capture data as its main parameter source and allows an animator to apply a wide variety of stylistic changes via graphical user interfaces (GUIs). Our framework is designed to overcome many limitations of previous systems by editing both the quantitative and qualitative aspects of motion with multiple animation systems, each of which focuses on a different motion property during synthesis: interactivity, composition, and timing.

For interactive editing of locomotion style, our system makes stylistic changes via correlations extracted between the end effectors, which we name drives, and the body movement. When an animator interactively controls the positional data for the wrists, ankles, and center of mass, the system automatically updates the current pose at each frame from the drive positions and driven orientations, using inverse kinematics (IK) and balance-maintenance routines. The overall editing process is controlled by a set of simple and intuitive linear operations on the motion drives or extracted correlations. Thus, an animator can quickly transform the input locomotion into a desired style at interactive speed.

For expressive locomotion generation on an arbitrary path, we provide a system that adopts multiple example clips on a motion path specified by an animator. Significantly, the system requires only a single example of straight-path locomotion for each style modeled, yet can produce output locomotion for an arbitrary path with arbitrary transitions. Several techniques automate the overall synthesis: detection of multiple foot-plants in unlabeled examples, estimation of an adaptive blending length for natural style changes, and a post-processing step that enhances the physical realism of the output animation.
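The blending step above can be illustrated with a minimal sketch (function names are hypothetical; the actual system estimates the blend length adaptively per transition and operates on full skeletal poses, not flat angle lists). Here a fixed-length window cross-fades joint angles between two example clips using smoothstep ease-in/ease-out weights:

```python
def blend_weights(n):
    """Smoothstep (ease-in/ease-out) weights over a blend window of n frames."""
    return [3 * t * t - 2 * t * t * t for t in (i / (n - 1) for i in range(n))]

def blend_clips(a, b, n):
    """Cross-fade the last n frames of clip `a` into the first n frames of clip `b`.
    Each frame is a flat list of joint angles (radians)."""
    w = blend_weights(n)
    blended = [
        [(1 - wk) * ya + wk * yb for ya, yb in zip(fa, fb)]
        for wk, fa, fb in zip(w, a[-n:], b[:n])
    ]
    return a[:-n] + blended + b[n:]
```

Choosing the window length per transition (rather than fixing it) is what lets a style change read as natural rather than abrupt.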
Compared with previous approaches, our system requires significantly less data and manual labor while supporting a large range of styles.

When generating locomotion, it is particularly challenging to adjust the motion's style in a qualitative way. Our component-based system composes human locomotion from a set of example locomotion clips. The distinctive style of each example is analyzed as sub-motion components, decomposed from separate body parts via independent component analysis (ICA). During synthesis, we use these components as combinatorial ingredients to generate new locomotion sequences that are stylistically different from the example set. The system is designed for animators who may not know important locomotion properties, such as the correlations throughout the body; it therefore analyzes the examples in an unsupervised manner and synthesizes an output locomotion from a small number of control parameters. Our experimental results show that the system can generate physically plausible locomotion in a desired style at interactive speed.

Timing plays an important role in specifying how a character moves from one pose to another. To capture the timing variations in the example set and use them for style transfer, we propose an editing system with separate controls over the temporal properties of an input motion: a global timing transfer and an upper-body timing transfer. The global timing transfer matches the input motion to the body speed of a selected example motion, conveying the overall sense of emotional or physical state observed in that example. The upper-body timing transfer, in contrast, propagates the sense of movement flow through the torso and arms, which is often referred to as succession.
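A minimal sketch of the global timing transfer, assuming the speed ratio between input and example has already been estimated (the name `retime` and the flat per-frame representation are illustrative assumptions): the input is uniformly resampled so its overall playback speed matches the example's.

```python
def retime(motion, speed_ratio):
    """Uniformly resample a motion (list of per-frame joint-angle lists) so it
    plays back speed_ratio times faster, using linear interpolation."""
    n_in = len(motion)
    n_out = max(2, round(n_in / speed_ratio))
    out = []
    for i in range(n_out):
        t = i * (n_in - 1) / (n_out - 1)  # fractional source-frame position
        lo = int(t)
        hi = min(lo + 1, n_in - 1)
        a = t - lo
        out.append([(1 - a) * x + a * y for x, y in zip(motion[lo], motion[hi])])
    return out
```

A uniform resampling like this changes only when poses occur, not what they are, which is why the overall emotional or physical character of the example's pacing carries over.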
We transfer this succession by capturing the relative changes of joint rotations in the upper body from the example motion and applying them, scaled, to the input motion. Overall, this system gives an animator temporal control over locomotion style without destroying the spatial details and constraints preserved in the original motion. (Abstract shortened by UMI.)
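One way to read the scaled succession transfer above is as overlaying a scaled copy of the example's frame-to-frame joint-angle changes onto the input's upper-body joints; the sketch below is a sketch under that assumption only (function name and flat-list representation are hypothetical, and the real system works on joint rotations of a full skeleton):

```python
def transfer_succession(input_motion, example_motion, scale=0.5):
    """Add scale * (frame-to-frame change of the example's upper-body joint
    angles) to the input motion. Assumes both motions share frame count and
    joint ordering; frame 0 is left untouched."""
    out = [frame[:] for frame in input_motion]
    for f in range(1, min(len(input_motion), len(example_motion))):
        for j, _ in enumerate(out[f]):
            delta = example_motion[f][j] - example_motion[f - 1][j]
            out[f][j] = input_motion[f][j] + scale * delta
    return out
```

Because only relative changes are transferred, the input's absolute poses, and hence its spatial constraints such as foot-plants, are largely preserved.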
Keywords/Search Tags: Locomotion, Example, Style, System