A new control paradigm: software-based gesture analysis for music

ZATLAB: A Gesture Analysis System to Music Interaction

ARTECH 2012 Conference, 2012

Human gesture is an important means of expression and interaction with the world, and plays an important role in the perception and interpretation of human communication. Over the years, different approaches to capturing and studying human gestures and movements have been proposed in various fields, namely Human-Computer Interaction and Kinesiology (the scientific study of the properties of human motion). This paper proposes a new modular system, named Zatlab, that allows real-time control of music generation through expressive gestures, letting dancers and computer music performers (among others) explore novel ways of interacting with computer-based music and sound creation tools. The system is based on real-time, non-intrusive human gesture recognition, which analyzes movement and gesture in terms of low-level features (e.g. distance, velocity, acceleration) and high-level features (e.g. quantity of movement), and uses machine learning algorithms to map them onto various parameters of music generation algorithms, allowing a more semantically and artistically meaningful mapping of human gesture to sound.
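As an illustration of the low-level and high-level features the abstract mentions, the sketch below derives per-joint speed, acceleration, and a simple quantity-of-movement measure from a stream of tracked positions. The array layout, the 30 fps frame rate, and the summed-speed definition of quantity of movement are assumptions for illustration, not Zatlab's actual implementation.

```python
import numpy as np

def gesture_features(positions, dt=1 / 30.0):
    """Low-level gesture features from a (frames x joints x 3) array of positions.

    The data layout and 30 fps frame rate are illustrative assumptions; the
    paper does not specify Zatlab's internal representation.
    """
    displacement = np.diff(positions, axis=0)          # frame-to-frame motion per joint
    distance = np.linalg.norm(displacement, axis=-1)   # distance moved per joint
    velocity = distance / dt                           # scalar speed per joint
    acceleration = np.diff(velocity, axis=0) / dt      # change of speed per joint

    # A common proxy for "quantity of movement": total joint speed per frame
    quantity_of_movement = velocity.sum(axis=1)
    return velocity, acceleration, quantity_of_movement
```

Features such as these would then be passed to whatever learning algorithm performs the gesture-to-sound mapping.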

Gesture interaction for electronic music performance

2007

This paper describes an approach for a system that analyses an orchestra conductor in real time, with the purpose of using the extracted timing and expression information to automatically play a computer-controlled instrument (synthesizer). In its final stage, the system will use non-intrusive computer vision methods to track the hands of the conductor. The main challenge is to interpret the motion of the hand/baton/mouse as beats for the timeline.
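A minimal version of the hand-to-beat interpretation the paper aims at could mark a beat wherever the hand's downward motion turns upward. The sketch below assumes a sampled vertical trajectory and a fixed refractory gap; the paper does not describe its actual beat detector.

```python
import numpy as np

def detect_beats(hand_y, dt=1 / 30.0, min_gap=0.25):
    """Mark beats at local minima of hand height (down-to-up turning points).

    The 30 fps sampling rate and the 0.25 s minimum gap between beats are
    illustrative assumptions, not values from the paper.
    """
    vy = np.diff(hand_y) / dt            # vertical velocity of the tracked hand
    beats, last_beat = [], -np.inf
    for i in range(1, len(vy)):
        t = i * dt
        # downward motion (vy < 0) turning into upward motion marks the beat point
        if vy[i - 1] < 0 <= vy[i] and t - last_beat >= min_gap:
            beats.append(t)
            last_beat = t
    return beats
```

The intervals between successive beat times would then give the tempo used to drive the synthesizer's timeline.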

Gesture-Based Controllers for Computer Instruments

chaimreus.com

Computer-based music brings with it an immense world of possibility that is only beginning to be tapped. Once an avant-garde novelty, the computer as an instrument has risen to the edge of the mainstream and is begging to be introduced to the layman musician as an accessible sound-making tool. The computer instrument is capable of creating a type of sound that doesn't conform to the traditional paradigms of pitch and harmony; as a result, new ways of controlling algorithmic instruments are needed. I researched and created a musical interface that uses computer vision algorithms to follow the hand gestures of a performer illuminated in ultraviolet light. The objective of my interface is to satisfy the conditions conceived by Gurevich and von Muehlen in their publication on new instruments, with a special focus on promoting virtuosic use. A case study was performed using the interface prototype, with promising results for future development.
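The interface description suggests a straightforward segmentation problem: hands fluorescing under ultraviolet light stand out as bright, saturated blobs. Below is a minimal sketch of such tracking with OpenCV; the HSV threshold and minimum blob area are guesses for illustration, since the exact parameters are not given in the text.

```python
import cv2

def track_glowing_hands(frame, min_area=500):
    """Return image coordinates of brightly fluorescing blobs (e.g. UV-lit hands).

    The HSV threshold and minimum contour area are illustrative assumptions.
    """
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    # Keep only saturated, bright pixels - the fluorescing hands
    mask = cv2.inRange(hsv, (0, 100, 200), (179, 255, 255))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

    centroids = []
    for c in contours:
        if cv2.contourArea(c) < min_area:
            continue
        m = cv2.moments(c)
        centroids.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    return centroids
```

The resulting centroid trajectories are what an interface of this kind would map to control parameters of the algorithmic instrument.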

A Technological Platform for Analyzing and Improving Musicians' Sound-Gesture Interactions

HAL (Le Centre pour la Communication Scientifique Directe), 2020

This paper presents a technological platform aimed at analyzing and improving musicians' sound-gesture relationships. The conceptual foundations of the platform were driven by our latest research results, which highlight the close intertwining between sound quality and motor behavior among professional cellists. In particular, the results revealed that the cellists' timbre was consistently altered when they were deprived of their fine postural movements, such as torso swaying or head nodding. Reciprocally, we are now interested in investigating how subtle real-time deformations of their natural timbre may affect their functional behavior. Besides instrumental timbre, the platform should also make it possible to assess the interplay between the musicians' motion and the surrounding space by modifying their acoustic environment in an ecological way. After two years of development, we present here a mature architecture of our multimodal tool through the prism of these investigations. Such an architecture is particularly suitable for designing complex sound synthesis protocols involving the musicians' perception of their kinesphere in relation to the musical structure.

Gesture capture: Paradigms in interactive music/dance systems

2011

Electronic or digital interactive systems have been experimented with in dance for more than fifty years. The piece Variations V by Merce Cunningham and John Cage, performed in 1965 with dancers interacting with analog electronic sound systems, is one such groundbreaking case (cf. Miller 2001).

White Paper WHP 273 Musical Movements-Gesture Based Audio Interfaces

2013

Recent developments have led to the availability of consumer devices capable of recognising certain human movements and gestures. This paper is a study of novel gesture-based audio interfaces. The authors present two prototypes for interacting with audio/visual experiences. The first allows a user to ‘conduct’ a recording of an orchestral performance, controlling the tempo and dynamics. The paper describes the audio and visual capture of the orchestra and the design and construction of the audio-visual playback system. An analysis of this prototype, based on testing and feedback from a number of users, is also provided. The second prototype uses the gesture tracking algorithm to control a three-dimensional audio panner. This audio panner is tested and feedback from a number of professional engineers is analysed. This work was presented at the 131st Audio Engineering Society Convention in New York in October 2011. Additional keywords: audio, ambisonics, interaction, panning, orchestra, ...
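The second prototype's three-dimensional panner can be pictured as a standard first-order ambisonic encoder whose angles are driven by the tracked gesture. The sketch below shows only that encoding step; how the hand position maps to azimuth and elevation is not detailed in the paper, so the angles are taken directly as inputs here.

```python
import numpy as np

def ambisonic_pan(mono, azimuth, elevation):
    """Encode a mono signal into first-order B-format (W, X, Y, Z).

    In a gesture-controlled panner, azimuth/elevation would come from the
    tracked hand position; that mapping is left outside this sketch.
    """
    w = mono / np.sqrt(2.0)                          # omnidirectional component
    x = mono * np.cos(azimuth) * np.cos(elevation)   # front-back
    y = mono * np.sin(azimuth) * np.cos(elevation)   # left-right
    z = mono * np.sin(elevation)                     # up-down
    return np.stack([w, x, y, z])
```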

Online Gesture Analysis and Control of Audio Processing

Springer Tracts in Advanced Robotics, 2011

This chapter presents a general framework for gesture-controlled audio processing. The gesture parameters are assumed to be multi-dimensional temporal profiles obtained from movement or sound capture systems. The analysis is based on machine learning techniques, comparing the incoming data stream with stored templates. The mapping procedures between gesture and audio processing include a specific method we call temporal mapping, in which the temporal evolution of the gesture input is taken into account in the mapping process. We describe an example use of the framework, which we have experimented with in various contexts, including music and dance performances, music pedagogy and installations.
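A minimal way to realize the template comparison behind this kind of temporal mapping is an online dynamic time warping follower, sketched below: each incoming frame updates one column of the cost table, yielding both a dissimilarity and an estimate of how far along the stored gesture the performer is, which can then index the audio processing in time. This is a generic DTW sketch, not the authors' actual algorithm, which the chapter bases on machine learning models.

```python
import numpy as np

class TemplateFollower:
    """Online comparison of an incoming gesture with one stored template (DTW-style)."""

    def __init__(self, template):
        self.template = np.asarray(template)   # (T, dims) reference gesture, T > 1
        self.cost = None                       # running DTW cost column

    def step(self, frame):
        """Consume one observation; return (normalized position, alignment cost)."""
        d = np.linalg.norm(self.template - frame, axis=1)  # distance to every template frame
        if self.cost is None:
            new = np.cumsum(d)                 # first observation: vertical moves only
        else:
            new = np.empty_like(d)
            new[0] = self.cost[0] + d[0]
            for i in range(1, len(d)):
                new[i] = d[i] + min(new[i - 1], self.cost[i], self.cost[i - 1])
        self.cost = new
        position = int(np.argmin(new))         # most plausible point reached in the template
        return position / (len(new) - 1), float(new[position])
```

Feeding the normalized position into a playback or effect parameter gives the temporal behaviour described here, where the evolution of the gesture, rather than only its instantaneous value, drives the audio processing.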

Aspects of Gesture in Digital Musical Instrument Design

williambrent.conflations.com

The flexibility of current hardware and software has made the mapping of relationships between a sound's parameters and a physical source of control a relatively trivial task. Consequently, sophisticated digital instrument design has been accessible to the creative community for several years, which has resulted in a host of new instruments that explore a variety of physical mappings. The emphasis on physicality exhibited by so-called "gestural controllers" stands in contrast to the practice of conventional laptop performance. While the laptop computer is certainly a digital musical instrument, its associated performance practice is often criticized for a perceived lack of physicality. This paper examines the motivations behind the foregrounding of gesture in computer-based performance. Critical theory and neuroscience research are drawn upon in order to consider the ways in which the desire for connections between sound and motion amounts to more than mere fascination with virtuosity.