Style Translation for Human Motion

Eugene Hsu, Kari Pulli, Jovan Popovic / SIGGRAPH 2005

Paper (1.2M PDF) / Video (42M MP4) / Extra (29M MP4) / Data (146M ZIP) / Talk (47M ZIP)

Style translation is the process of transforming an input motion into a new style while preserving its original content. This problem is motivated by the needs of interactive applications, which require rapid processing of captured performances. Our solution learns to translate by analyzing differences between performances of the same content in input and output styles. It relies on a novel correspondence algorithm to align motions, and a linear time-invariant model to represent stylistic differences. Once the model is estimated with system identification, our system is capable of translating streaming input with simple linear operations at each frame.
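The per-frame translation step can be sketched as a discrete state-space update. This is a minimal illustration, assuming a standard LTI form x' = Ax + Bu, y = Cx + Du; the dimensions and matrices below are hypothetical placeholders, not the values the system would learn via system identification.

```python
import numpy as np

# Hypothetical LTI model dimensions and matrices, for illustration only.
n_state, n_pose = 4, 3
rng = np.random.default_rng(0)
A = np.eye(n_state) * 0.9                      # state transition (stable by construction)
B = rng.standard_normal((n_state, n_pose)) * 0.1
C = rng.standard_normal((n_pose, n_state)) * 0.1
D = np.eye(n_pose)                             # direct feed-through of the input pose

def translate_stream(frames):
    """Translate a stream of input poses one frame at a time.

    Each output frame costs only a few matrix-vector products,
    matching the claim of simple linear operations per frame.
    """
    x = np.zeros(n_state)                      # internal style state
    for u in frames:
        y = C @ x + D @ u                      # output pose in the target style
        x = A @ x + B @ u                      # advance the state
        yield y

# Feed in a short synthetic stream of input-style poses.
frames = rng.standard_normal((5, n_pose))
out = list(translate_stream(frames))
```

Because the state carries over between frames, the model can shape the output using recent history while still processing the stream online, one frame at a time.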