Sonification of Movement Qualities – a Case Study on Fluidity

Interactive sonification of a fluid dance movement: an exploratory study

Journal on Multimodal User Interfaces, 2018

In this paper we present three experiments designed to explore sound properties associated with fluid movement: (1) an experiment in which participants adjusted parameters of a sonification model developed for a fluid dance movement, (2) a vocal sketching experiment in which participants sketched sounds portraying fluid versus nonfluid movements, and (3) a workshop in which participants discussed and selected fluid versus nonfluid sounds. Consistent findings from the three experiments indicated that sounds expressing fluidity generally occupy a lower register and have less high-frequency content, as well as a lower bandwidth, than sounds expressing nonfluidity. The ideal sound to express fluidity is continuous, calm, slow, pitched, and reminiscent of wind, water, or an acoustic musical instrument. The ideal sound to express nonfluidity is harsh, non-continuous, abrupt, dissonant, conceptually associated with metal or wood, inhuman and robotic. Findings presented in this paper can be used as design guidelines for future applications in which the movement property fluidity is to be conveyed through sonification.
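These guidelines (lower register, reduced high-frequency content, and continuity for fluidity; broadband, abrupt bursts for nonfluidity) can be illustrated with a minimal synthesis sketch. The specific frequencies, vibrato depth, and gating rate below are illustrative assumptions, not parameters from the study:

```python
import numpy as np

SR = 44100  # sample rate in Hz

def spectral_centroid(signal, sr=SR):
    """Magnitude-weighted mean frequency of the signal's spectrum."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sr)
    return np.sum(freqs * spectrum) / np.sum(spectrum)

def fluid_tone(duration=1.0, f0=110.0, sr=SR):
    """Continuous, low-register sine with a slow, calm pitch undulation:
    a crude stand-in for the 'fluid' sound profile."""
    t = np.linspace(0, duration, int(sr * duration), endpoint=False)
    vibrato = 3.0 * np.sin(2 * np.pi * 0.5 * t)  # slow undulation, ±3 Hz
    return 0.5 * np.sin(2 * np.pi * (f0 + vibrato) * t)

def nonfluid_tone(duration=1.0, sr=SR, seed=0):
    """Gated broadband noise: abrupt, non-continuous bursts with a high
    spectral centroid, a crude stand-in for the 'nonfluid' profile."""
    rng = np.random.default_rng(seed)
    n = int(sr * duration)
    noise = rng.uniform(-1.0, 1.0, n)           # broadband noise
    gate = (np.arange(n) // (sr // 10)) % 2     # on/off every 100 ms
    return noise * gate
```

Comparing `spectral_centroid(fluid_tone())` with `spectral_centroid(nonfluid_tone())` reproduces the reported contrast: the fluid signal concentrates its energy in a low register, while the gated noise spreads it across the spectrum.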

Sonification of fluidity - An exploration of perceptual connotations of a particular movement feature

2016

In this study we conducted two experiments in order to investigate potential strategies for sonification of the expressive movement quality "fluidity" in dance: one perceptual rating experiment (1) in which five different sound models were evaluated on their ability to express fluidity, and one interactive experiment (2) in which participants adjusted parameters for the most fluid sound model in (1) and performed vocal sketching to two video recordings of contemporary dance. Sounds generated in the fluid condition occupied a low register and had darker, more muffled, timbres compared to the non-fluid condition, in which sounds were characterized by a higher spectral centroid and contained more noise. These results were further supported by qualitative data from interviews. The participants conceptualized fluidity as a property related to water, pitched sounds, wind, and continuous flow; non-fluidity had connotations of friction, struggle and effort. The biggest conceptual distinction between fluidity and non-fluidity was the dichotomy of "nature" and "technology", "natural" and "unnatural", or even "human" and "inhuman". We suggest that these distinct connotations should be taken into account in future research focusing on the fluidity quality and its corresponding sonification.

A serious games platform for validating sonification of human full-body movement qualities

In this paper we describe a serious games platform for validating sonification of human full-body movement qualities. This platform supports the design and development of serious games aimed at validating (i) our techniques for measuring expressive movement qualities, and (ii) the mapping strategies that translate such qualities into the auditory domain by means of interactive sonification and active music experience. The platform is part of a more general framework developed in the context of the EU ICT H2020 DANCE "Dancing in the dark" Project n.645553, which aims at enabling visually impaired people to perceive nonverbal artistic whole-body experiences.

Bringing musicality to movement sonification

Proceedings of the 8th Audio Mostly Conference on - AM '13, 2013

In this paper we describe a novel approach to the sonification of crawl swimming movements. The design method integrates task and data analysis from a sport science perspective with the subjective experience of swimmers and swimming coaches, and relies strongly on the skills of musicians to define the basic sonic design. We report on the design process, and on the implementation and evaluation of a first prototype.

Movement To Emotions To Music: Using Whole Body Emotional Expression As An Interaction For Electronic Music Generation

2012

The augmented ballet project aims at gathering research from several fields and directing it towards the same application case: adding virtual elements (visual and acoustic) to a live dance performance, and allowing the dancer to interact with them. In this paper, we describe a novel interaction used in the frame of this project: the dancer's movements are used to recognize the emotions they express, and these emotions drive musical audio flows that evolve in real time. The originality of this interaction is threefold. First, it covers the whole interaction cycle from the input (the dancer's movements) to the output (the generated music). Second, the interaction is not direct but goes through a high level of abstraction: the dancer's emotional expression is recognized and serves as the source of music generation. Third, the interaction has been designed and validated through constant collaboration with a choreographer, culminating in an augmented ballet performance...

Vocalizing dance movement for interactive sonification of laban effort factors

Proceedings of the 2014 conference on Designing interactive systems - DIS '14, 2014

We investigate the use of interactive sound feedback for dance pedagogy based on the practice of vocalizing while moving. Our goal is to allow dancers to access a greater range of expressive movement qualities through vocalization. We propose a methodology for the sonification of Effort Factors, as defined in Laban Movement Analysis, based on vocalizations performed by movement experts. Based on the experiential outcomes of an exploratory workshop, we propose a set of design guidelines that can be applied to interactive sonification systems for learning to perform Laban Effort Factors in a dance pedagogy context.

Interactive sonification of emotionally expressive gestures by means of music performance

Proc. of ISon, 2010

This study presents a procedure for interactive sonification of emotionally expressive hand and arm gestures by affecting a musical performance in real time. Three different mappings are described that translate accelerometer data to a set of parameters controlling the expressiveness of the performance by affecting tempo, dynamics, and articulation. The first two mappings, tested with a number of subjects during a public event, are relatively simple and were designed by the authors using a top-down approach. According to user feedback, they were not intuitive and limited the usability of the software. A bottom-up approach was taken for the third mapping: a classification tree was trained with features extracted from gesture data of a number of test subjects who were asked to express different emotions with their hand movements. A second set of data, in which subjects were asked to make a gesture corresponding to a piece of expressive music they had just listened to, was used to validate the model. The results were not particularly accurate, but reflected the small differences in the data and in the ratings given by the subjects to the different performances they listened to.

Sound response to physicality. Artistic expressions of movement sonification

2022

This paper introduces the reader to the extraordinarily broad subject of movement sonification, namely how to create, participate in, and understand the phenomenon, with special emphasis on human perception and the proprioceptive system. Particular attention is also paid to technological issues, presented and described using the example of IMU technology.

Towards an Interactive Model-Based Sonification of Hand Gesture for Dance Performance

2020

This paper presents ongoing research on interactive sonification of hand gestures in dance performances. For this purpose, a conceptual framework and a multilayered mapping model derived from an experimental case study are proposed. The goal of this research is twofold. On the one hand, we aim to determine action-based perceptual invariants that allow us to establish pertinent relations between gesture qualities and sound features. On the other hand, we are interested in analysing how an interactive model-based sonification can provide useful and effective feedback for dance practitioners. From this point of view, our research explicitly addresses the convergence between the scientific understanding provided by the field of movement sonification and the traditional know-how developed over the years within the digital instrument and interaction design communities. A key component of our study is the combination of physically based sound synthesis and motion feature analysis...

Sonification of Coordinated Body Movements

2011

This paper introduces a new hardware and software system for the interactive sonification of sports movements involving arm and leg movements. Two different sonifications are designed to convey rhythmical patterns that become auditory gestalts, so that listeners can identify features of the underlying coordinated movement. The sonification is designed to enable visually impaired users to participate in aerobics exercises, and also to enhance the perception of movements for sighted participants, which is useful, for instance, if the scene is occluded or the head posture is incompatible with observing the instructor or fitness professional who demonstrates the exercises in parallel. Furthermore, the system allows the monitoring of fine couplings in arm/leg coordination while jogging, as auditory feedback may help stabilize the movement pattern. We present the sensing system, two sonification designs, and interaction examples that lead to coordination-specific sound gestalts. ...
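One simple way to make arm/leg coupling audible, sketched below under assumptions of our own (the constants and the pitch mapping are hypothetical, not the designs described above), is to map the wrapped phase difference between the two limb oscillations to a pitch deviation, so that in-phase movement produces a steady base tone and drift away from coordination is immediately heard:

```python
import math

def coordination_pitch(arm_phase, leg_phase, base_freq=220.0):
    """Map the arm-leg phase difference (radians) to a tone frequency.
    Perfectly in-phase movement yields base_freq; maximal anti-phase
    movement deviates by up to one octave. Constants are illustrative."""
    # Wrap the difference into (-pi, pi] so the mapping is continuous.
    diff = math.atan2(math.sin(arm_phase - leg_phase),
                      math.cos(arm_phase - leg_phase))
    # One octave of deviation at |diff| == pi.
    return base_freq * 2.0 ** (diff / math.pi)
```

Driving an oscillator with this frequency would give a coordination-specific sound gestalt: steady pitch for stable coupling, audible glides as the coupling drifts.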