Informed Use of Motion Synthesis Methods
Related papers
Real Time Animation of Virtual Humans: A Trade-off Between Naturalness and Control
2010
Virtual humans are employed in many interactive applications using 3D virtual environments, including (serious) games. The motion of such virtual humans should look realistic (or 'natural') and allow interaction with the surroundings and other (virtual) humans. Current animation techniques differ in the trade-off they offer between motion naturalness and the control that can be exerted over the motion. We show mechanisms to parameterize, combine (on different body parts) and concatenate motions generated by different animation techniques. We discuss several aspects of motion naturalness and show how it can be evaluated. We conclude by showing the promise of combinations of different animation paradigms to enhance both naturalness and control.
Exploiting Motion Capture for Virtual Human Animation
2010
Motion capture (mocap) provides highly precise data of human movement which can be used for empirical analysis and virtual human animation. In this paper, we describe a corpus that has been collected for the purpose of modelling movement in a dyadic conversational context. We describe the technical setup, scenarios and challenges involved in capturing the corpus, and present ways of annotating and visualizing the data. For visualization we suggest the techniques of motion trails and animated recreation. We have incorporated these motion capture visualization techniques as extensions to the ANVIL tool and into a procedural animation system, and show a first attempt at automated analysis of the data (handedness detection).
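The handedness detection mentioned above could, as a rough illustration, compare how much each wrist travels over a capture session. This is a hypothetical heuristic sketched for clarity, not the paper's actual algorithm; the function name and synthetic data are invented.

```python
import numpy as np

def detect_handedness(left_wrist, right_wrist):
    """Guess the dominant hand from two (T, 3) arrays of wrist positions
    by comparing cumulative travelled distance (an illustrative heuristic,
    not the method from the paper)."""
    def travel(traj):
        # Sum of Euclidean distances between consecutive frames.
        return np.linalg.norm(np.diff(traj, axis=0), axis=1).sum()
    return "left" if travel(left_wrist) > travel(right_wrist) else "right"

# Synthetic example: the right wrist gestures far more than the left.
t = np.linspace(0, 2 * np.pi, 100)
left = np.stack([0.1 * np.sin(t), 0 * t, 0 * t], axis=1)
right = np.stack([0.5 * np.sin(3 * t), 0.5 * np.cos(3 * t), 0 * t], axis=1)
print(detect_handedness(left, right))  # → right
```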
A Framework for Motion Based Bodily Enaction with Virtual Characters
Lecture Notes in Computer Science, 2011
We propose a novel methodology for authoring interactive behaviors of virtual characters. Our approach is based on enaction, which means a continuous two-directional loop of bodily interaction. We have implemented the case of two characters, one human and one virtual, who are separated by a glass wall and can interact only through bodily motions. Animations for the virtual character are based on captured motion segments and descriptors for the style of motions that are automatically calculated from the motion data. We also present a rule authoring system that is used for generating behaviors for the virtual character. Preliminary results of an enaction experiment with an interview show that the participants could experience the different interaction rules as different behaviors or attitudes of the virtual character.
Guest Editors' Introduction: Computer Animation for Virtual Humans
IEEE Computer Graphics and Applications, 1998
Advances in computer animation techniques have spurred increasing levels of realism and movement in virtual characters that closely mimic physical reality. Increases in computational power and control methods enable the creation of 3D virtual humans for real-time interactive applications. Artificial intelligence techniques and autonomous agents give computer-generated characters a life of their own and let them interact with other characters in virtual worlds. Developments and advances in networking and virtual reality (VR) let multiple participants share virtual worlds and interact with applications or each other.
Human motion for virtual people
2004
While computer animation is widely used to create characters in games, films, and various other applications, techniques such as motion capture and keyframing remain relatively expensive. Automatic acquisition of secondary motion and/or motion prototyping using machine learning might be a solution to this problem. Our paper presents an application of Q-learning algorithms to generate action sequences for animated characters. The techniques can be used in both deterministic and non-deterministic environments to generate actions which can later be incorporated into more complex animation sequences. The paper presents applications of both the deterministic and the non-deterministic update of the Q-learning algorithm to automatic acquisition of motion. Results obtained from the learning system are also compared to human motion, and conclusions are drawn.
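The deterministic Q-learning update the abstract refers to can be sketched on a toy problem. The states, actions, and rewards below are invented for illustration (a one-dimensional "stage" the character learns to cross), not the environments used in the paper:

```python
import random

# Deterministic Q-learning sketch: the character learns an action
# sequence that moves it from cell 0 to the goal cell 4.
N, GOAL, GAMMA = 5, 4, 0.9
ACTIONS = {"step_left": -1, "step_right": +1}
Q = {(s, a): 0.0 for s in range(N) for a in ACTIONS}

def step(s, a):
    s2 = min(max(s + ACTIONS[a], 0), N - 1)   # clamp to the stage
    return s2, (1.0 if s2 == GOAL else 0.0)

random.seed(0)
for _ in range(500):                           # exploratory episodes
    s = random.randrange(N)
    while s != GOAL:
        a = random.choice(list(ACTIONS))
        s2, r = step(s, a)
        # Deterministic update: Q(s,a) <- r + gamma * max_a' Q(s',a')
        Q[(s, a)] = r + GAMMA * max(Q[(s2, a2)] for a2 in ACTIONS)
        s = s2

# Greedy playback yields an action sequence for the animation layer.
s, seq = 0, []
while s != GOAL:
    a = max(ACTIONS, key=lambda a: Q[(s, a)])
    seq.append(a)
    s, _ = step(s, a)
print(seq)  # → ['step_right', 'step_right', 'step_right', 'step_right']
```

The learned sequence can then be mapped onto motion clips, which is the sense in which the paper incorporates learned actions into longer animation sequences.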
Motion control of synthetic actors: an integrated view of human animation
1989
This paper explains the ideal concepts that must be part of a system for synthetic actor animation. After a brief introduction to the role of synthetic actors, five major steps to the motion control of these actors are discussed: positional constraints and inverse kinematics, dynamics, impact of the environment, task planning and behavioral animation.
From motion capture to real-time character animation
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 2008
This paper describes a framework for animating virtual characters in real-time environments using motion capture data. We focus mainly on adapting motion capture data to the virtual skeleton and to its environment. To speed up this real-time process, we introduce a morphology-independent representation of motion. Based on this representation, we have redesigned the methods for inverse kinematics and kinetics so that the motion can be adapted to spacetime constraints, including control of the center-of-mass position. If the resulting motion does not satisfy general mechanical laws (such as keeping the angular momentum constant during aerial phases), the current pose is corrected. Additional external forces can also be taken into account in the dynamic correction module, so that the character automatically bends its hips when pushing heavy objects, for example. All of this is performed in real time.
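The center-of-mass control described above can be illustrated in its simplest form: compute the mass-weighted COM of the body segments and rigidly translate the pose so the COM meets a target. The segment names and masses below are hypothetical, and a rigid translation is a crude stand-in for the paper's kinetics-aware inverse kinematics solve:

```python
import numpy as np

# Hypothetical segment masses (kg); the paper's model is not reproduced here.
SEGMENT_MASS = {"pelvis": 11.0, "torso": 33.0, "head": 7.0,
                "l_leg": 16.0, "r_leg": 16.0, "l_arm": 4.5, "r_arm": 4.5}

def center_of_mass(positions):
    """Mass-weighted average of per-segment COM positions ((3,) arrays)."""
    total = sum(SEGMENT_MASS.values())
    return sum(SEGMENT_MASS[k] * positions[k] for k in SEGMENT_MASS) / total

def enforce_com(positions, target_com):
    """Rigidly translate the whole pose so its COM lands on target_com."""
    offset = target_com - center_of_mass(positions)
    return {k: p + offset for k, p in positions.items()}

# Usage: nudge a toy pose so its COM sits above the origin at height 1 m.
pose = {k: np.array([i * 0.1, 1.0, 0.0]) for i, k in enumerate(SEGMENT_MASS)}
fixed = enforce_com(pose, target_com=np.array([0.0, 1.0, 0.0]))
```

A real solver would instead distribute the correction over joint angles so that end-effector constraints (feet on the ground, hands on an object) are preserved.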
Motion capture based motion analysis and motion synthesis for human-like character animation
2009
Motion capture technology is recognised as a standard tool in the computer animation pipeline. It provides detailed movement data for animators; however, it also introduces problems and raises concerns for creating realistic and convincing motion for character animation. This thesis investigates post-processing techniques that result in realistic motion generation. A number of techniques are introduced that improve the quality of motion generated from motion capture data, especially when integrating motion transitions from different motion clips. The presented motion data reconstruction technique can build convincing, realistic transitions from an existing motion database and overcome the inconsistencies introduced by traditional motion blending techniques. It also provides a method for animators to re-use motion data more efficiently. Along with the development of motion data transition reconstruction, the motion capture data mapping technique was investigated...
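The "traditional motion blending" that the thesis improves on is typically a cross-fade of joint values over a transition window. The sketch below shows that conventional baseline, not the thesis's reconstruction technique; clip contents and sizes are invented:

```python
import numpy as np

def blend_transition(clip_a, clip_b, window):
    """Cross-fade the last `window` frames of clip_a into the first `window`
    frames of clip_b. Clips are (frames, joints) arrays of joint angles.
    This linear blend is the conventional approach whose inconsistencies
    the thesis addresses."""
    w = np.linspace(0.0, 1.0, window)[:, None]       # blend weight per frame
    overlap = (1 - w) * clip_a[-window:] + w * clip_b[:window]
    return np.concatenate([clip_a[:-window], overlap, clip_b[window:]])

walk = np.zeros((30, 4))         # toy clips: 30 frames, 4 joint angles
run = np.ones((30, 4))
out = blend_transition(walk, run, window=10)
print(out.shape)  # → (50, 4)
```

Linear blends like this can violate foot contacts and momentum, which is why transition reconstruction from a motion database is proposed instead.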
Production and playback of human figure motion for visual simulation
We describe a system for off-line production and real-time playback of motion for articulated human figures in 3D virtual environments. The key notions are (1) the logical storage of full-body motion in posture graphs, which provides a simple motion access method for playback, and (2) mapping the motions of higher-DOF figures to lower-DOF figures using slaving to provide human models at several levels of detail, both in geometry and articulation, for later playback. We present our system in the context of a simple problem: animating human figures in a distributed simulation, using DIS protocols for communicating the human state information. We also discuss several related techniques for real-time animation of articulated figures in visual simulation.
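The posture-graph storage idea can be sketched as a directed graph whose nodes are stored postures and whose edges are allowed transitions; playback is then a walk over the graph. The node names and graph below are invented for illustration, not taken from the paper:

```python
import random

# Hypothetical posture graph: each node maps to the postures reachable
# from it in one transition (self-edges allow holding a posture).
POSTURE_GRAPH = {
    "stand":  ["stand", "walk", "crouch"],
    "walk":   ["walk", "stand"],
    "crouch": ["crouch", "stand", "crawl"],
    "crawl":  ["crawl", "crouch"],
}

def playback(start, steps, rng):
    """Simple playback access method: walk `steps` edges from `start`,
    choosing uniformly among allowed transitions."""
    seq, node = [start], start
    for _ in range(steps):
        node = rng.choice(POSTURE_GRAPH[node])
        seq.append(node)
    return seq

seq = playback("stand", 6, random.Random(1))
print(seq)
```

In a real system each node would carry stored full-body motion, and the controller (or a simulation host speaking DIS) would pick edges deliberately rather than at random.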