Abstract
Existing synthesis methods for closely interacting virtual characters rely on user-specified constraints such as reaching positions and the distance between body parts. In this paper, we present a novel method for synthesizing new interacting motions by composing two existing interacting motion samples, without the need to specify constraints manually. Our method automatically detects the type of interaction contained in the inputs and determines a suitable timing for the interaction composition by analyzing the spacetime relationships of the input characters. To preserve the features of the inputs in the synthesized interaction, the two inputs are aligned and normalized according to the relative distance and orientation of the input characters. Using a linear optimization method, the output is the optimal solution that preserves both the close interaction between the two characters and the local details of each character's individual behavior. The output animations demonstrate that our method can create interactions in new styles that combine the characteristics of the original inputs.
| Original language | English |
|---|---|
| Pages (from-to) | 41-50 |
| Journal | Computer Graphics Forum |
| Volume | 32 |
| Issue number | 7 |
| DOIs | |
| Publication status | Published - 25 Nov 2013 |