Virtual dance and music environment using motion capture

Gesture capture: Paradigms in interactive music/dance systems

2011

Electronic and digital interactive systems have been used experimentally in dance for more than fifty years. The piece Variations V by Merce Cunningham and John Cage, performed in 1965 with dancers interacting with analog electronic sound systems, is a groundbreaking early case (cf. Miller 2001).

“Topos” toolkit for Pure Data: exploring the spatial features of dance gestures for interactive musical applications

The dissemination of multimedia technologies in information societies has created an interesting scenario: unprecedented access to a diverse combination of music, image, video and other media streams has raised demand for more interactive and expressive multimodal experiences. How can these demands for richer music-movement interactions be supported? How can spatiotemporal qualities of human movement be translated into relevant features for music making and sound design? In this paper we study the real-time interaction between choreographic movement in space and music, implemented by means of a collection of tools called Topos. The tools were developed on the Pure Data platform and provide a number of feature descriptions that help to map the quality of dance gestures in space to music and other media. The features are based on concepts found in the literature on cognition and dance, which improves the computational representation of dance gestures in space. The concepts and techniques presented in the study introduce new problems and new possibilities for multimedia applications involving dance and music interaction.
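As a concrete illustration of the kind of feature extraction and mapping the abstract describes, the sketch below computes two simple spatial descriptors from a tracked trajectory and rescales one into a MIDI controller range. This is a simplified, hypothetical reconstruction in Python, not the Topos toolkit itself (which consists of Pure Data abstractions); the function names and the specific feature choices are assumptions.

```python
# Hypothetical sketch (not the Topos code): deriving spatial features from a
# tracked point's trajectory and mapping one of them to a musical control value.
import numpy as np

def spatial_features(positions, dt):
    """positions: (N, 3) array of x, y, z samples; dt: sampling interval (s)."""
    velocity = np.diff(positions, axis=0) / dt           # per-frame velocity
    speed = np.linalg.norm(velocity, axis=1)             # scalar speed per frame
    # Bounding-box volume as a crude measure of how much space the gesture uses.
    extent = positions.max(axis=0) - positions.min(axis=0)
    return {"mean_speed": float(speed.mean()),
            "kinesphere_volume": float(np.prod(extent))}

def map_to_midi_cc(value, lo, hi):
    """Linearly rescale a feature into the 0-127 MIDI controller range."""
    value = min(max(value, lo), hi)
    return int(round(127 * (value - lo) / (hi - lo)))

if __name__ == "__main__":
    t = np.linspace(0, 2, 120)                           # two seconds at 60 fps
    path = np.stack([np.sin(3 * t), np.cos(2 * t), 0.5 * t], axis=1)
    feats = spatial_features(path, dt=t[1] - t[0])
    print(feats, map_to_midi_cc(feats["mean_speed"], 0.0, 5.0))
```

In a live setting the same descriptors would presumably be computed per analysis window on a streaming trajectory and sent to the sound engine (for example over OSC) rather than calculated over a whole recording.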

Using Music and Motion Analysis to Construct 3D Animations and Visualizations

2015

This paper presents a study of music analysis, motion analysis, and the integration of music and motion to produce natural human motion in a virtual environment. Motion capture data is extracted to generate a motion library, which places the digital motion model at a fixed posture. The first step in this process is to configure the motion path curve for the database and to compute, with an algorithm, the probability that two motions can follow each other in sequence. Every motion is then analyzed for the next possible smooth movement to connect to, and an interpolation method is used to create the transitions between motions so that the digital motion models move fluently. Lastly, a search algorithm sifts possible successive motions from the motion path curve according to the music tempo. It was concluded that the higher the rescaling ratio of a transition, the less natural the resulting motion.
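The core of such a pipeline can be sketched in three functions: a pose-distance cost that scores whether one clip can follow another, a cross-fade that interpolates the overlapping frames, and a selection step biased by the beat period. The Python sketch below is a hypothetical reconstruction under simplified assumptions (poses as flattened position vectors, linear interpolation rather than quaternion blending); it is not the paper's implementation.

```python
# Assumed sketch of a motion-graph step: transition scoring, blending, and
# tempo-aware selection of the next clip. Names and weights are illustrative.
import numpy as np

def transition_cost(tail, head):
    """Mean squared pose distance between the last frames of one clip and the
    first frames of a candidate successor; lower means a smoother connection."""
    return float(np.mean((tail - head) ** 2))

def blend(tail, head):
    """Linearly cross-fade the overlapping windows so the model moves fluently
    across the cut (a production system would slerp joint rotations instead)."""
    w = np.linspace(0.0, 1.0, len(tail))[:, None]
    return (1.0 - w) * tail + w * head

def pick_next(tail, candidates, durations, beat_period, tempo_weight=1.0):
    """Choose the successor whose duration best aligns with the beat period,
    penalized by the pose-transition cost."""
    def score(i):
        rem = durations[i] % beat_period
        tempo_error = min(rem, beat_period - rem)
        return tempo_weight * tempo_error + transition_cost(
            tail, candidates[i][:len(tail)])
    return min(range(len(candidates)), key=score)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    current = rng.normal(size=(60, 45)).cumsum(axis=0)   # 60 frames, 15 joints x 3
    clips = [rng.normal(size=(n, 45)).cumsum(axis=0) for n in (48, 72, 90)]
    durations = [len(c) / 60.0 for c in clips]           # seconds at 60 fps
    j = pick_next(current[-10:], clips, durations, beat_period=0.5)
    seam = blend(current[-10:], clips[j][:10])           # 10-frame transition
    print("chose clip", j, "seam shape", seam.shape)
```

The paper's concluding observation maps directly onto the blending step: the more a transition window is stretched or compressed to hit the tempo, the further the interpolated frames drift from captured poses, which is why heavier rescaling reads as less natural.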

ZATLAB: A Gesture Analysis System to Music Interaction

ARTECH 2012 Conference, 2012

Human gesture is an important means of expression and interaction with the world, and plays an important role in the perception and interpretation of human communication. Over the years, different approaches have been proposed to capture and study human gestures and movements in various fields, notably Human-Computer Interaction and Kinesiology (the scientific study of the properties of human motion). This paper proposes a new modular system, named Zatlab, that allows music generation to be controlled in real time through expressive gestures, letting dancers and computer music performers (among others) explore novel ways of interacting with music and sound creation tools. The system is based on real-time, non-intrusive human gesture recognition: it analyzes movement and gesture in terms of low-level features (e.g. distance, velocity, acceleration) and high-level features (e.g. quantity of movement), and uses machine learning algorithms to map them to parameters of music generation algorithms, allowing a more semantically and artistically meaningful mapping of human gesture to sound.
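To make this feature vocabulary concrete, here is a minimal Python sketch of how such low-level and high-level descriptors might be computed from a stream of tracked joint positions. The formulas are assumptions based on common practice (finite differences for velocity and acceleration, summed joint speed for quantity of movement); Zatlab's exact definitions and its machine-learning mapping stage are not reproduced here.

```python
# Minimal sketch of assumed low- and high-level gesture features; Zatlab's
# actual definitions may differ.
import numpy as np

def low_level_features(joint_positions, dt):
    """joint_positions: (frames, joints, 3). Returns per-frame velocity and
    acceleration magnitudes for each tracked joint."""
    vel = np.diff(joint_positions, axis=0) / dt
    acc = np.diff(vel, axis=0) / dt
    return np.linalg.norm(vel, axis=2), np.linalg.norm(acc, axis=2)

def quantity_of_movement(speed, window=30):
    """High-level feature: joint speeds summed per frame, then smoothed over a
    sliding window, a common proxy for overall movement activity."""
    total = speed.sum(axis=1)                        # sum over joints per frame
    kernel = np.ones(window) / window
    return np.convolve(total, kernel, mode="valid")  # smoothed activity curve

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    mocap = rng.normal(size=(120, 15, 3)).cumsum(axis=0)  # fake 15-joint stream
    speed, accel = low_level_features(mocap, dt=1 / 60)
    print(quantity_of_movement(speed)[:5])
```

In a system like the one described, descriptors of this kind would feed the machine-learning layer, which maps them onto parameters of the music generation algorithms rather than wiring features to sound directly.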