CONTEXTUAL INQUIRY FOR A CLIMATE AUDIO INTERFACE
Related papers
User-Centred Audio Interface for Climate Science
This paper presents a user-centred design approach to creating an audio interface for climate science. In a first round of interviews, contextual inquiry including think-aloud protocols was used to gather data about scientists' workflows. Focus groups were then used to gather information about climate experts' specific use of language, and the interviews were also analysed for their language content. Two goals are envisaged with this basic assessment. First, a climate terminology will help realise domain-specific descriptions of the sonifications that are understandable in the field. Second, identifying metaphors can help build a metaphoric sound identity for the sonification. An audio interface shall enrich the scientists' perceptualisation possibilities, based on the language metaphors derived from the interviews. Later, in a separate set of experiments, participants were asked to pair sound stimuli with climate terms extracted from the first interviews and to evaluate the sound samples aesthetically. They were asked to choose the sound textures (from a given set) that best express a specific climate parameter and to rate the relevance of each sound to the metaphor. Correlations between climate terminology and sound stimuli are assessed to improve the sound design of the sonification tool. Intuitiveness, learnability, memorability, and aesthetic preference of the sounds are measured in the evaluations.
Frontiers in Psychology
Introduction: It has proven a hard challenge to stimulate climate action with climate data. While scientists communicate through words, numbers, and diagrams, artists use movement, images, and sound. Sonification, the translation of data into sound, and visualization offer techniques for representing climate data, often with innovative and exciting results. The concept of sonification was initially defined in terms of engineering, and while this view remains dominant, researchers increasingly draw on knowledge from electroacoustic music (EAM) to make sonifications more convincing.
Methods: The Aesthetic Perspective Space (APS) is a two-dimensional model that bridges utilitarian-oriented sonification and music. We started with a review of 395 sonification projects, from which a corpus of 32 that target climate change was chosen; a subset of 18 also integrate visualization of the data. To clarify relationships with climate data sources, we determined topics and subtopics in a hierarchi...
Facilitating Reflection on Climate Change Using Interactive Sonification
Zenodo (CERN European Organization for Nuclear Research), 2023
This study explores the possibility of using musical soundscapes to facilitate reflection on the impacts of climate change. By sonifying historic and future climate data, an interactive timeline was created where the user can explore a soundscape changing in time. A prototype was developed and tested in a user study with 15 participants. Results indicate that the prototype successfully elicits the emotions that it was designed to communicate and that it does influence the participants' reflections. However, it remains uncertain how much the prototype actually helped them while reflecting.
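The interactive timeline described above is, at its core, a parameter mapping from climate data to sound over time. As a minimal illustration (not the authors' actual implementation; the anomaly values, ranges, and MIDI mapping are assumptions), a yearly temperature series could be mapped to pitch like this:

```python
# Hypothetical parameter-mapping sketch: map yearly temperature
# anomalies (in degrees C) onto a timeline of MIDI note numbers,
# so that warmer years sound higher. All ranges are illustrative.

def anomaly_to_midi(anomaly, a_min=-1.0, a_max=2.0, note_lo=48, note_hi=84):
    """Linearly map a temperature anomaly to a MIDI note number.

    Values outside [a_min, a_max] are clamped to the mapping range.
    """
    clamped = max(a_min, min(a_max, anomaly))
    frac = (clamped - a_min) / (a_max - a_min)
    return round(note_lo + frac * (note_hi - note_lo))

# Hypothetical decadal anomalies forming the interactive timeline.
anomalies = [-0.2, 0.0, 0.4, 0.9, 1.5]
timeline = [anomaly_to_midi(a) for a in anomalies]
```

A real prototype would of course render these notes as an evolving soundscape rather than discrete pitches, but the data-to-parameter mapping is the common starting point.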
Using sound to represent uncertainty in future climate projections for the United Kingdom
Proceedings of the 17th International Conference …, 2011
This paper compares different visual and sonic methods of representing uncertainty in spatial data. When handling large volumes of spatial data, users can be limited in how much can be displayed at once due to visual saturation (when no more data can be shown visually without obscuring existing data). Using sound in combination with visual methods may help to represent uncertainty in spatial data; this example uses the UK Climate Predictions 2009 (UKCP09) dataset, where uncertainty has been included for the first time. Participants took part in the evaluation via a web-based interface which used the Google Maps API to show the spatial data and capture user inputs. Using sound and vision together to show the same variable may also be useful to colour-blind users. Previous awareness of the dataset appears to have a significant impact (p < 0.001) on participants' ability to utilise the sonification. Using sound to reinforce data shown visually results in increased scores (p = 0.005), and using sound to show some data instead of vision showed a significant increase in speed without reducing effectiveness (p = 0.033) with repeated use of the sonification.
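Offloading uncertainty to the auditory channel typically means mapping a normalised uncertainty value to a single sound parameter while the data value itself stays visual. A minimal sketch, assuming a pitch mapping (the function name, frequency range, and mapping direction are illustrative, not the paper's design):

```python
# Hedged sketch: encode the uncertainty of a spatial projection as
# pitch, freeing the visual channel (e.g. map colour) for the data
# value itself. Higher uncertainty -> higher pitch here, but the
# direction and range are assumptions for illustration.

def uncertainty_to_pitch(uncertainty, f_low=220.0, f_high=880.0):
    """Map a normalised uncertainty value in [0, 1] to a frequency in Hz."""
    if not 0.0 <= uncertainty <= 1.0:
        raise ValueError("uncertainty must be in [0, 1]")
    return f_low + (f_high - f_low) * uncertainty

# A grid cell whose projected value has moderate uncertainty:
print(uncertainty_to_pitch(0.5))  # -> 550.0
```

Whether such a redundant or substitutive encoding helps is exactly what the study's web-based evaluation measures.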
Humanising data through sound: Res Extensae and a user-centric approach to data sonification
2018
In this paper, starting from a case study (the mixed-media data sonification installation Res Extensae), we discuss a number of assumptions about the efficacy of sound as a means to represent and communicate numerical data. The discussion is supported by the results of a questionnaire aimed at validating our assumptions, conducted with fifteen of the participants in the experience. At the same time, we have the ambition to contribute to a wider debate on the value of data sonification. We introduce the first stage of research on sonification as a design-driven, user-centred, and multi-modal experience, one closer to data design practices than to traditional composition and computer music. We describe the use of physical objects to help users put sounds and data into a wider context, improving the user experience and facilitating the comprehension and retention of the meaning of the data.
Collaborative Exploration and Sensemaking of Big Environmental Sound Data
Computer Supported Cooperative Work (CSCW), 2017
Many ecologists use acoustic monitoring to study animals and the health of ecosystems. Technological advances mean acoustic recording of nature can now be done at relatively low cost, with minimal disturbance, and over long periods of time. Vast amounts of data are gathered, yielding environmental soundscapes which require new forms of visualization and interpretation. Recently a novel visualization technique has been designed that represents soundscapes using dense visual summaries of acoustic patterns. However, little is known about how this visualization tool can be employed to make sense of soundscapes. Understanding how the technique can best be used and developed requires collaboration between interface designers, algorithm designers, and ecologists. We empirically investigated the practices and needs of ecologists using acoustic monitoring technologies. In particular, we investigated the use of the soundscape visualization tool by teams of ecologists researching endangered species detection, species behaviour, and the monitoring of ecological areas using long-duration audio recordings. Our findings highlight the opportunities and challenges that ecologists face in making sense of large acoustic datasets through patterns of acoustic events. We reveal the characteristic processes for collaboratively generating situated accounts of natural places from soundscapes using visualization, and we discuss the biases inherent in the approach. Big data from nature has different characteristics from the social and informational data sources that comprise much of the World Wide Web. We conclude with design implications for visual interfaces that facilitate collaborative exploration and discovery through soundscapes.
Design and outcomes of an acoustic data visualization seminar
The Journal of the Acoustical Society of America, 2014
Recently, the Department of Media Technology at Aalto University offered a seminar entitled Applied Data Analysis and Visualization. The course used spatial impulse response measurements from concert halls as the context to explore high-dimensional data visualization methods. Students were encouraged to represent source and receiver positions, spatial aspects, and temporal development of sound fields, frequency characteristics, and comparisons between halls, using animations and interactive graphics. The primary learning objectives were for the students to translate their skills across disciplines and gain a working understanding of high-dimensional data visualization techniques. Accompanying files present examples of student-generated, animated and interactive visualizations.
Sonifying for Public Engagement: A Context-based Model for Sonifying Air Pollution Data
In this paper we report on a unique and contextually sensitive approach to sonifying a subset of climate data: urban air pollution for four Canadian cities. Like other data-driven models for sonification and auditory display, this model details an approach to data parameter mappings; however, we specifically consider the context of a public engagement initiative and reception by an 'everyday' listener, which informs our design. Further, we present an innovative model for FM index-driven sonification that rests on the notion of 'harmonic identities' for each air pollution data parameter sonified, allowing us to sonify more datasets in a perceptually 'economic' way. Finally, we briefly discuss usability and design implications and outline future work.
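In FM synthesis, driving the modulation index from a data value changes a tone's brightness while the carrier-to-modulator ratio fixes its spectral character. One way to read "harmonic identities" is to give each pollutant its own fixed ratio and let concentration drive the index. The sketch below follows that reading; the pollutant set, ratios, and ranges are illustrative assumptions, not the authors' design:

```python
import math

# Hedged sketch of FM index-driven sonification. Each pollutant is
# assigned a fixed carrier:modulator ratio (its hypothetical
# 'harmonic identity'), and the measured concentration drives the
# FM modulation index, i.e. the tone's brightness.

HARMONIC_IDENTITY = {       # (carrier Hz, modulator ratio) - illustrative
    "NO2": (220.0, 1.0),    # integer ratio -> harmonic spectrum
    "O3": (220.0, 2.0),
    "PM2.5": (220.0, 1.41), # irrational ratio -> inharmonic spectrum
}

def fm_sample(t, pollutant, concentration, c_max=100.0, index_max=8.0):
    """One sample of an FM tone whose modulation index scales with
    the pollutant concentration (clamped at c_max)."""
    fc, ratio = HARMONIC_IDENTITY[pollutant]
    fm = fc * ratio
    index = index_max * min(concentration, c_max) / c_max
    return math.sin(2 * math.pi * fc * t + index * math.sin(2 * math.pi * fm * t))
```

Because one FM voice per parameter carries both identity (ratio) and magnitude (index), several pollutant streams can share the mix without one sound per data dimension, which is one plausible sense of sonifying "in a perceptually 'economic' way".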