New Interfaces for Spatial Musical Expression
Related papers
Non-Rigid Musical Interfaces: Exploring Practices, Takes, and Future Perspective
2020
Non-rigid interfaces allow for exploring new interactive paradigms that rely on deformable input and shape change, and whose possible applications span several branches of human-computer interaction (HCI). While extensively explored as deformable game controllers, bendable smartphones, and shape-changing displays, non-rigid interfaces are rarely framed in a musical context, and their use for composition and performance is rather sparse and unsystematic. With this work, we begin a systematic exploration of this relatively uncharted research area by (1) briefly reviewing existing musical interfaces that capitalize on deformable input, and (2) surveying 11 experts and pioneers in the field about their experience with and vision of non-rigid musical interfaces. Based on the experts' input, we suggest possible next steps for musical appropriation of deformable and shape-changing technologies. We conclude by discussing how cross-overs between NIME and HCI research will benefit...
Challenges in Designing New Interfaces for Musical Expression
Lecture Notes in Computer Science, 2014
New interfaces are changing the way we interact with computers. In the musical context, these new technologies open a wide range of possibilities for the creation of New Interfaces for Musical Expression (NIME). Despite 10 years of research in NIME, it is hard to find artifacts that have been widely or convincingly adopted by musicians. In this paper, we discuss some NIME design challenges, highlighting particularities related to the digital and musical nature of these artifacts, such as virtuosity, cultural elements, context of use, creation catalysis, success criteria, and adoption strategy. With these challenges, we aim to call attention to the intersection of music, computing, and design, which can be an interesting area for people working on product design and interaction design.
Requirements on Kinaesthetic Interfaces for Spatially Interactive Sonic Art
This paper documents the requirements on tracking technology for spatially interactive sonic arts. We do this by comparing our theorised notion of an ideal kinaesthetic interface with, first, the current results of an ongoing online survey and, second, the results of our ongoing Workshop on Music, Space & Interaction (MS&I). In MS&I we research the affordances of existing and hypothetical technology to enhance and facilitate spatial interactivity. We give both qualitative and quantitative recommendations for design. While underlining the specific requirements of sonic art with respect to its aural nature, we discuss how and why the requirements elicited from our research can be applied to spatial interactivity in general.
A spatial interface for audio and music production
… Conference on Digital Audio …, 2006
In an effort to find a better-suited interface for musical performance, a novel approach has been developed. At its heart is the concept of physical interaction with sound in space, where sound processing occurs at various 3-D locations and the routing of sound signals from one area to another is based on physical models of sound propagation. The control is based on a gestural vocabulary that is familiar to users, involving natural spatial interaction such as translating, rotating, and pointing in 3-D. This research presents a framework for real-time control of 3-D audio and describes how to construct audio scenes to accomplish various musical tasks. The generality and effectiveness of this approach have enabled us to reimplement several conventional applications, with the benefit of a substantially more powerful interface, and have further led to the conceptualization of several novel applications.
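The abstract gives no code, but the distance-based physics it invokes is standard acoustics. A minimal sketch, assuming the usual inverse-distance amplitude law and a constant speed of sound (neither value is taken from the paper), might look like this:

```python
import math

SPEED_OF_SOUND = 343.0  # m/s at roughly 20 degrees C

def propagation(source_pos, listener_pos):
    """Return (gain, delay_seconds) for sound travelling from
    source_pos to listener_pos, both (x, y, z) tuples in metres.

    A hypothetical stand-in for the paper's physical model of
    sound propagation: 1/r attenuation plus a constant-speed delay."""
    dx, dy, dz = (s - l for s, l in zip(source_pos, listener_pos))
    distance = max(math.sqrt(dx * dx + dy * dy + dz * dz), 0.1)  # clamp near zero
    gain = 1.0 / distance              # inverse-distance attenuation
    delay = distance / SPEED_OF_SOUND  # time of flight
    return gain, delay

# Example: a source 2 m away arrives at half amplitude, ~5.8 ms later.
print(propagation((2.0, 0.0, 0.0), (0.0, 0.0, 0.0)))
```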
A Survey on the Use of 2D Touch Interfaces for Musical Expression
Proceedings of the International Conference on New Interfaces for Musical Expression, 2020
Expressive 2D multi-touch interfaces have in recent years moved from research prototypes to industrial products, from repurposed generic computer input devices to controllers specially designed for musical expression. A host of practicioners use this type of devices in many different ways, with different gestures and sound synthesis or transformation methods. In order to get an overview of existing and desired usages, we launched an on-line survey that collected 37 answers from practicioners in and outside of academic and design communities. In the survey we inquired about the participants' devices, their strengths and weaknesses, the layout of control dimensions, the used gestures and mappings, the synthesis software or hardware and the use of audio descriptors and machine learning. The results can inform the design of future interfaces, gesture analysis and mapping, and give directions for the need and use of machine learning for user adaptation.
Divergence Press (Centre for Research in New Music at the University of Huddersfield)
This article explores practical and aesthetic questions concerning spatial music performance by interrogating new developments within an emerging hyperinstrumental practice. The performance system is based on an electric guitar with individuated audio outputs per string and a multichannel loudspeaker array. A series of spatial music mapping strategies explores in-kind relationships between a formal melodic syntax model and an ecological flocking simulator, exploiting broader notions of embodiment underpinning the metaphorical basis for the experience and understanding of musical structure. The extension and refinement of this system has been based on a combination of practice-led and theoretical developments. The resulting mapping strategies forge new gestural narratives between physical and figurative gestural planes, culminating in a responsive, bodily based, and immersive spatial music performance practice. The operation of the performance system is discussed in relation to supporting audiovisual materials.
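To make the flocking-to-space idea concrete: one generic way to realise such a mapping (not the author's implementation; the six-agents-per-six-strings pairing below is an assumption for illustration) is to let each agent's position drive a panning azimuth:

```python
import math
import random

class Agent:
    """One member of a minimal 2-D flock (cohesion rule only, for brevity)."""
    def __init__(self):
        self.x, self.y = random.uniform(-1, 1), random.uniform(-1, 1)
        self.vx = self.vy = 0.0

    def step(self, cx, cy, pull=0.05):
        # steer toward the flock centroid, then move
        self.vx += (cx - self.x) * pull
        self.vy += (cy - self.y) * pull
        self.x += self.vx
        self.y += self.vy

def azimuths(agents):
    """Advance the flock one step and map each agent's position to a
    panning azimuth in degrees, e.g. one agent per guitar string."""
    cx = sum(a.x for a in agents) / len(agents)
    cy = sum(a.y for a in agents) / len(agents)
    for a in agents:
        a.step(cx, cy)
    return [math.degrees(math.atan2(a.y, a.x)) for a in agents]

flock = [Agent() for _ in range(6)]  # six strings, six agents (assumed pairing)
print(azimuths(flock))
```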
Pointing Fingers: Using Multiple Direct Interactions with Visual Objects to Perform Music
2003
In this paper, we describe a new interface for musical performance that uses interaction with a graphical user interface in a powerful manner: the user directly touches a screen where graphical objects are displayed and can use several fingers simultaneously to interact with the objects. The concept of this interface is based on the superposition of the space of gesture and the space of visual feedback; it gives the impression that the graphical objects are real. This concept enables great freedom in designing interfaces. The gesture device we have created gives the positions of four fingertips using 3D sensors, and the data is processed in the Max/MSP environment. We have realized two practical examples of musical use of such a device, using Photosonic Synthesis and Scanned Synthesis.
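The paper routes the fingertip data into Max/MSP; a hypothetical sketch of the sensor-to-parameter step is shown below. The coordinate ranges, the 100-1000 Hz pitch mapping, and the hover threshold are all illustrative assumptions, not the authors' values:

```python
def map_fingertips(fingertips, width=1.0, height=1.0):
    """Map up to four fingertip positions from a 3-D sensor to
    per-voice synthesis parameters.

    fingertips: list of (x, y, z) tuples in sensor units, where
    x/y lie on the screen plane and z is height above it.
    Returns one (frequency_hz, amplitude) pair per finger."""
    voices = []
    for x, y, z in fingertips:
        freq = 100.0 + 900.0 * min(max(x / width, 0.0), 1.0)  # left-right -> pitch
        amp = min(max(1.0 - y / height, 0.0), 1.0)            # bottom-top -> level
        if z > 0.02:  # finger lifted off the surface: mute that voice
            amp = 0.0
        voices.append((freq, amp))
    return voices

# Two fingers touching the screen, one hovering above it.
print(map_fingertips([(0.5, 0.2, 0.0), (0.9, 0.5, 0.0), (0.1, 0.1, 0.1)]))
```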
Gesture Control of Sound Spatialization for Live Musical Performance
Gesture-Based Human-Computer Interaction and Simulation, 2009
This paper presents the development of methods for gesture control of sound spatialization. It provides a comparison of seven popular software spatialization systems from a control point of view, and examines human-factors issues relevant to gesture control. An effort is made to reconcile these two design and parameter spaces, and to draw useful conclusions regarding likely successful mapping strategies. Lastly, examples are given using several different gesture-tracking and motion capture systems controlling various parameters of the spatialization system.
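One of the simplest mapping strategies in this space, a tracked direction driving pairwise panning over a speaker ring, can be sketched as follows. The equal-power pairwise law here is a generic technique, not tied to any of the seven systems surveyed:

```python
import math

def ring_gains(azimuth_deg, n_speakers=8):
    """Equal-power pairwise panning over a circular array of
    n_speakers equally spaced loudspeakers.

    azimuth_deg: source direction from a gesture tracker, in degrees.
    Returns one gain per speaker; only the pair straddling the
    source direction is non-zero."""
    spacing = 360.0 / n_speakers
    pos = (azimuth_deg % 360.0) / spacing      # position in "speaker units"
    left = int(pos) % n_speakers               # speaker just below the source angle
    right = (left + 1) % n_speakers            # its clockwise neighbour
    frac = pos - int(pos)                      # 0..1 between the pair
    gains = [0.0] * n_speakers
    gains[left] = math.cos(frac * math.pi / 2)   # equal-power crossfade
    gains[right] = math.sin(frac * math.pi / 2)
    return gains

# A source at 100 degrees on an 8-speaker ring (speakers every 45 degrees).
print([round(g, 3) for g in ring_gains(100.0)])
```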
Creating tangible spatial-musical images from physical performance gestures
2020
Electroacoustic music has a longstanding relationship with gesture and space. This paper marks the start of a project investigating acousmatic spatial imagery, real gestural behaviour, and ultimately the formation of tangible acousmatic images. These concepts are explored experimentally using motion tracking in a source-sound recording context, interactive parameter-mapping sonification in three-dimensional high-order ambisonics, composition, and performance. The spatio-musical role of physical actions in relation to instrument excitation is used as a point of departure for embodying physical spatial gestures in the creative process. The work draws on how imagery for music is closely linked with imagery for music-related actions.
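The ambisonic stage of such a pipeline is well defined even without the paper's code: encoding a mono source from a tracked direction into B-format follows standard formulas. A sketch at first order in the FuMa convention (the paper works at higher orders; first order is shown only for brevity):

```python
import math

def encode_foa(sample, azimuth_deg, elevation_deg):
    """Encode one mono sample into first-order ambisonic B-format
    (FuMa convention: W, X, Y, Z) from a source direction."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    w = sample / math.sqrt(2.0)              # omnidirectional component
    x = sample * math.cos(az) * math.cos(el) # front-back
    y = sample * math.sin(az) * math.cos(el) # left-right
    z = sample * math.sin(el)                # up-down
    return w, x, y, z

# A full-scale sample arriving from 45 degrees to the left, at ear level.
print(encode_foa(1.0, 45.0, 0.0))
```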
ToCoPlay: Graphical Multi-touch Interaction for Composing and Playing Music
Human-Computer Interaction …, 2011
With the advent of electronic music and computers, the human-sound interface is liberated from the specific physical constraints of traditional instruments, which means that we can design musical interfaces that provide arbitrary mappings between human actions and sound generation. This freedom has resulted in a wealth of new tools for electronic music generation that expand the limits of expression, as exemplified by projects such as Reactable and Bricktable. In this paper we present ToCoPlay, an interface that further explores the design space of collaborative, multi-touch music creation systems. ToCoPlay is unique in several respects: it allows creators to dynamically transition between the roles of composer and performer, it takes advantage of a flexible spatial mapping between a musical piece and the graphical interface elements that represent it, and it applies current and traditional interface interaction techniques for the creation of music.