Analyzing journeys in sound: usability of graphical interpolators for sound design
Related papers
A Journey in (Interpolated) Sound: Impact of Different Visualizations in Graphical Interpolators
Audio Mostly 2019, 2019
Graphical interpolation systems provide a simple mechanism for controlling sound synthesis systems by offering a level of abstraction above the parameters of the synthesis engine, allowing users to explore different sounds without awareness of the synthesis details. While a number of graphical interpolator systems with a variety of user-interface designs have been developed over many years, few have been subject to user evaluations. We present the testing and evaluation of alternative visualizations for a graphical interpolator, in order to establish whether the visual feedback provided through the interface aids the navigation and identification of sounds within the system. The testing took the form of comparing users' mouse traces, showing the journeys they made through the interpolated sound space when different visual interfaces were used. Sixteen participants took part, and a summary of the results is presented, showing that the visuals provide users with additional cues that lead to better interaction with the interpolators.
CCS Concepts: • Human-centered computing → Visualization design and evaluation methods • Human-centered computing → Usability testing • Human-centered computing → Empirical studies in HCI
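As a rough illustration of how such mouse-trace journeys might be quantified (an assumed analysis, not the paper's actual code; the `trace_metrics` function and sample data below are hypothetical), simple measures such as total path length and trial duration can serve as proxies for how directly a user located a target sound:

```python
# Illustrative sketch (not the study's analysis code) of comparing mouse traces
# through an interpolation space: shorter, more direct journeys under one visual
# condition would suggest the visuals are providing useful navigation cues.

import math

def trace_metrics(trace):
    """trace: list of (t, x, y) samples recorded while the user explored the pane."""
    path_length = sum(
        math.hypot(x2 - x1, y2 - y1)
        for (_, x1, y1), (_, x2, y2) in zip(trace, trace[1:])
    )
    duration = trace[-1][0] - trace[0][0]
    return {"path_length": path_length, "duration": duration}

# Hypothetical trace: timestamped cursor positions in normalized pane coordinates.
example = [(0.0, 0.1, 0.1), (0.5, 0.3, 0.2), (1.1, 0.6, 0.5), (1.8, 0.62, 0.51)]
print(trace_metrics(example))
```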
Analysis and Evaluation of Visual Cues in Graphical Interpolators
Proceedings of the 19th Sound and Music Computing Conference, June 5-12th, 2022, Saint-Étienne (France), 2022
Graphical interpolators provide a simple mechanism for synthesis-based sound design by offering a level of abstraction above the synthesis parameters. These systems supply users with two sensory modalities in the form of sonic output from the synthesis engine and visual feedback from the interface. A number of graphical interpolator systems have been developed over the years that provide users with different visual cues, via the graphical display. This study compares user interactions with six interpolation systems that have alternative visualizations, in order to investigate the impact that the interface's different visual cues have on the process of locating sounds within the space. We also present a dimension space analysis of the interpolators and compare this with the user studies to explore its predictive potential in evaluating designs. The outcomes from our study help to better understand design considerations for graphical interpolators and will inform future designs.
Sound & Music Computing Conference 2019, 2019
This paper presents a framework that supports the development and evaluation of graphical interpolated parameter mapping for the purpose of sound design. These systems present the user with a graphical pane, usually two-dimensional, where synthesizer presets can be located. Moving an interpolation point cursor within the pane will then create new sounds by calculating new parameter values, based on the cursor position and the interpolation model used. The exploratory nature of these systems lends itself to sound design applications, which also have a highly exploratory character. Populating the interpolation space with "known" preset sounds allows the parameter space to be constrained, reducing the design complexity otherwise associated with synthesizer-based sound design. An analysis of previous graphical interpolators is presented, and from this a framework is formalized and tested to show its suitability for the evaluation of such systems. The framework has then been used to compare the functionality of a number of systems that have been previously implemented. This has led to a better understanding of the different sonic outputs that each can produce and highlighted areas for further investigation.
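A minimal sketch of the interpolation mechanism the abstract describes, under the assumption of an inverse-distance-weighted model with hypothetical preset data (the `interpolate` function, preset positions, and parameter names are illustrative, not taken from the framework itself):

```python
# Minimal sketch of graphical preset interpolation: presets are placed at 2D
# positions in the pane, and the cursor position produces a new parameter set
# as an inverse-distance-weighted blend of the stored presets.

import math

# Hypothetical synthesizer presets: (x, y) location in the pane -> parameter vector.
presets = {
    (0.1, 0.2): [0.80, 0.10, 0.33],   # e.g. [cutoff, resonance, attack]
    (0.9, 0.3): [0.20, 0.75, 0.05],
    (0.5, 0.8): [0.50, 0.40, 0.90],
}

def interpolate(cursor, presets, power=2.0, eps=1e-9):
    """Blend preset parameter vectors by inverse-distance weighting from the cursor."""
    weights, vectors = [], []
    for (px, py), params in presets.items():
        d = math.hypot(cursor[0] - px, cursor[1] - py)
        if d < eps:                      # cursor sits exactly on a preset
            return list(params)
        weights.append(1.0 / d ** power)
        vectors.append(params)
    total = sum(weights)
    dims = len(vectors[0])
    return [sum(w * v[i] for w, v in zip(weights, vectors)) / total
            for i in range(dims)]

# Moving the interpolation cursor yields new, constrained parameter values.
print(interpolate((0.4, 0.4), presets))
```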
Visualizing and Controlling Sound with Graphical Interfaces
Audio Engineering Society 41st Conference: Audio for Games (2011).
Developments in abstract representations of sound from the field of computer music have potential applications for designers of musical computer games. Research in cognition has identified correlations in the perceptions of visual objects and audio events; experiments show that test subjects associate certain qualities of graphical shapes with those of vocal sounds. Such 'sound symbolism' has been extended to non-vocal sounds, and this paper describes attempts to exploit this and other phenomena in the visualization of audio. The ideas are expanded upon to propose control of sound synthesis through the manipulation of virtual shapes. Mappings between parameters in the auditory and visual feedback modes are discussed. An exploratory user test examines the technique using a prototype system.
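As a hedged illustration of such audio-visual parameter mappings (the `shape_to_synth` function, descriptor names, and mapping choices are assumptions for the sake of example, not the paper's system), sound-symbolic associations can be expressed as simple functions from shape descriptors to synthesis parameters:

```python
# Hypothetical mapping from visual shape descriptors to synthesis parameters,
# e.g. spikier outlines -> brighter timbre, smaller shapes -> higher pitch.

def shape_to_synth(spikiness, size):
    """Map normalized shape descriptors (0..1) to illustrative synth parameters."""
    return {
        "brightness": spikiness,                   # angular shapes -> more high-frequency energy
        "pitch_hz": 80.0 + (1.0 - size) * 800.0,   # smaller shapes -> higher pitch
    }

print(shape_to_synth(spikiness=0.7, size=0.3))
```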
MorphOSC: A Toolkit for Building Sound Control GUIs with Preset Interpolation in Processing
Linux Audio Conference 2013, May 9-12 @ IEM, Graz, Austria (Forthcoming)
MorphOSC is a new toolkit for building graphical user interfaces for the control of sound using morphing between parameter presets. It uses the multidimensional interpolation space paradigm seen in some other systems, but hitherto unavailable as open-source software in the form presented here. The software is delivered as a class library for the Processing Development Environment and is cross-platform for desktop computers and Android mobile devices. This paper positions the new library within the context of similar software, introduces the main features of the initial code release and details future work on the project.
A Visualization Tool to Explore Interactive Sound Installations
NIME 2021
This paper presents a theoretical framework for describing interactive sound installations, along with an interactive database, provided as a web application, for visualizing various features of sound installations. A corpus of 195 interactive sound installations was reviewed to derive a taxonomy describing them from three perspectives: Artistic Intention, Interaction and System Design. A web application is provided to dynamically visualize and explore the corpus of sound installations using interactive charts (https://isi-database.herokuapp.com/). Our contribution is twofold: we provide a theoretical framework to characterize interactive sound installations, as well as a tool to inform sound artists and designers about current practices in interactive sound installation design.
A Framework for Personalization of Interactive Sound Synthesis
2005
Instruments like the piano or guitar have a long tradition in many cultures, such that non-musicians who encounter them understand that the piano keys can be pressed and the guitar strings can be plucked. Users of computer-based sound synthesis tools must use parameter names and interface feedback to develop a model of the available sound space of the instrument. Not all users may attribute the same weight to the parameters used by the tool designer.
Design exploration in interactive sonification
2012
This thesis examines interactive sonification, in particular the design, implementation and evaluation of user interface components using sound. It consists of a series of design interventions and explorations, including preliminary empirical investigations, that show features of this space. A novel search method and tool for finding sound files was created. The Sonic Browser utilised the human ability to listen to multiple simultaneous sounds, and allowed users to switch attention between the sounds while navigating a virtual ...
Development of a Graphical Sound Synthesis Controller Exploring Cross-Modal Perceptual Analogies
2007
"Analogous perceptual experiences in multiple sensory modalities provide possibilities for abstract expression using modern computer-based audio-visual systems. This work examines the phenomenon of such cross-modal links with a focus on mappings between the auditory and visual realms. The design and implementation of a graphics-based sound synthesis controller using such inter-sensory associations is presented. A review of literature relevant to the design of computer-based musical instruments is provided, including discussions of parameter mapping, the use of graphical displays and computer vision as a gesture capture mechanism. An analysis of the software instruments realised and the physical interface setup is provided. System improvements and possible applications are also discussed with the direction of future work with the system being suggested."