Interactive software for the sonification of neuronal activity

2020

We present software for the sonification of neuronal activity acquired by calcium imaging of neurons in the brain of zebrafish larvae. The objective is to facilitate the observation of temporal and spatial patterns of activity, as well as of synchronous phenomena between different neurons. We use a parameter-mapping approach: the method relies on manually drawn regions of interest (ROIs), whose average fluorescence values are computed to drive FM synthesis parameters. Each ROI represents a neuron and is associated with an FM synthesizer producing a sound signal. An interface was developed to allow one to tune the parameters of the sonification and to manage the data sets. This paper presents the different mappings that were tested, their implementation in the software, and the interface that was developed.
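The parameter-mapping idea described in this abstract (average ROI fluorescence driving FM synthesis parameters) can be sketched as follows. The carrier and modulator frequencies, the maximum modulation index, and the hold time per fluorescence sample are hypothetical choices for illustration, not the paper's actual settings:

```python
import numpy as np

def fm_sonify(trace, sr=44100, hold_s=0.05,
              carrier_hz=440.0, mod_hz=110.0, max_index=8.0):
    """Sonify one ROI's fluorescence trace with a simple FM synthesizer.

    Each fluorescence value is normalized to [0, 1] and held for hold_s
    seconds; it drives both the modulation index (timbre) and the
    amplitude of the FM tone (an assumed mapping, for illustration).
    """
    f = np.asarray(trace, dtype=float)
    span = f.max() - f.min()
    env = (f - f.min()) / span if span else np.zeros_like(f)
    env = np.repeat(env, int(sr * hold_s))     # hold each value for hold_s seconds
    t = np.arange(env.size) / sr
    index = max_index * env                    # brighter ROI -> richer timbre
    return env * np.sin(2 * np.pi * carrier_hz * t
                        + index * np.sin(2 * np.pi * mod_hz * t))
```

A multi-neuron display in this spirit would run one such synthesizer per ROI and mix the resulting signals.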

EXPLORING 3D AUDIO FOR BRAIN SONIFICATION

Brain activity data, measured by functional Magnetic Resonance Imaging (fMRI), produce extremely high-dimensional, sparse, and noisy signals that are difficult to visualize, monitor, and analyze. Spatial music can be particularly appropriate for representing the patterns they contain. The literature describes several studies on sonifying neuroimaging data, as well as different techniques for using spatialization as a musical language. In this paper, we discuss an artistic approach to fMRI sonification that exploits new compositional paradigms in spatial music: we treat brain activity as the audio base material of a spatial musical composition. Our approach attempts to explore the aesthetic potential of brain sonification not by transforming the data beyond the recognizable, but by presenting the data as directly as possible.

Immersive Sonification for Displaying Brain Scan Data

Proceedings of the International Conference on Health Informatics, 2013

Scans of brains result in data that can be challenging to display due to its complexity, multi-dimensionality, and range. Visual representations of such data are limited by the nature of the display, the number of dimensions that can be represented visually, and the capacity of our visual system to perceive and interpret visual data. This paper describes the use of sonification to interpret brain scans and to use sound as a complementary tool to view, analyze, and diagnose. The sonification tool SoniScan is described and evaluated as a method to augment visual brain data display.

Real-Time Wireless Sonification of Brain Signals

In this paper, an alternative representation of EEG is investigated, in particular, the translation of EEG into sound; patterns in the EEG then correspond to sequences of notes. The aim is to provide an alternative tool for analysing and exploring brain signals, e.g., for the diagnosis of neurological diseases. Specifically, a system is proposed that transforms EEG signals, recorded by a wireless headset, into sounds in real-time. In order to assess the resulting representation of EEG as sounds, the proposed sonification system is applied to EEG signals of Alzheimer's disease (AD) patients and healthy age-matched control subjects (recorded by a high-quality wired EEG system). Fifteen volunteers were asked to classify the sounds generated from the EEG of five AD patients and five healthy subjects.
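The EEG-to-notes translation described here can be illustrated with a minimal quantizer. The pentatonic scale and the amplitude-level scheme are assumptions made for this sketch, not the proposed system's actual mapping:

```python
import numpy as np

PENTATONIC = [60, 62, 64, 67, 69]   # MIDI notes, C major pentatonic (assumed scale)

def eeg_to_notes(eeg, scale=PENTATONIC):
    """Quantize each EEG sample into len(scale) amplitude levels and
    return the corresponding sequence of MIDI note numbers."""
    x = np.asarray(eeg, dtype=float)
    span = x.max() - x.min()
    norm = (x - x.min()) / span if span else np.zeros_like(x)
    levels = np.minimum((norm * len(scale)).astype(int), len(scale) - 1)
    return [scale[i] for i in levels]
```

Playing the returned notes at a fixed tempo turns an EEG trace into a melody whose contour follows the signal's amplitude.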

Sonifications for EEG Data Analysis

International Conference on Auditory Display, 2000

This paper presents techniques to render acoustic representations for EEG data. In our case, data are obtained from psycholinguistic experiments where subjects are exposed to three different conditions based on different auditory stimuli. The goal of this research is to uncover elements of neural processing correlated with high-level cognitive activity. Three sonifications are presented within this paper.

Sonification of Large Datasets in a 3D Immersive Environment: A Neuroscience Case Study

Auditory display techniques can play a key role in the understanding of hidden patterns in large datasets. In this study, we investigated the role of sonification applied to an immersive 3D visualization of a complex network dataset. As a test case, we used a 3D interactive visualization of the so-called connectome of the human brain in the immersive space called the "eXperience Induction Machine" (XIM). We conducted an empirical validation in which subjects were asked to perform a navigation task through the network and were subsequently tested on their understanding of the dataset. Our results showed that sonification provides a further layer of understanding of the dynamics of the network by enhancing the subjects' structural understanding of the data space.

Multi-channel sonification of human EEG

2007

The electroencephalogram (EEG) provides a diagnostically important stream of multivariate data of the activity of the human brain. Various EEG sonification strategies have been proposed, but auditory space has rarely been used to give cues about the location of specific events. Here we introduce a multivariate event-based sonification that, in addition to displaying salient rhythms, uses pitch and spatial location.
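An event-based mapping along these lines (pitch plus spatial location per detected EEG event) might look like the following sketch. The electrode-to-azimuth and salience-to-pitch formulas are hypothetical, chosen only to illustrate the idea:

```python
def sonify_event(lateral_pos, salience, base_hz=220.0, span_semitones=24):
    """Map one detected EEG event to a pitch and an azimuth.

    lateral_pos: normalized electrode position in [0, 1] (left to right),
                 mapped to an azimuth in [-90, +90] degrees.
    salience:    normalized event salience in [0, 1], mapped to a pitch
                 up to span_semitones above base_hz (assumed mapping).
    """
    azimuth_deg = -90.0 + 180.0 * lateral_pos
    pitch_hz = base_hz * 2.0 ** (span_semitones * salience / 12.0)
    return pitch_hz, azimuth_deg
```

A renderer would then place a short tone at pitch_hz on a speaker or binaural position corresponding to azimuth_deg, so that listeners hear where on the scalp each event occurred.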

Guest Editors' Introduction: An Introduction to Interactive Sonification

IEEE Multimedia, 2005

The research field of sonification, a subset of the topic of auditory display, has developed rapidly in recent decades. It brings together interests from the areas of data mining, exploratory data analysis, human-computer interfaces, and computer music. Sonification presents information by using sound (particularly nonspeech sound), so that the user of an auditory display obtains a deeper understanding of the data or processes under investigation by listening [1]. We define interactive sonification as the use of sound within a tightly closed human-computer interface where the auditory signal provides information about data under analysis, or about the interaction itself, which is useful for refining the activity.

Spatial Sonification of Data

This project presents a model to sonify data. It uses a multichannel audio system to create a spatial illusion of the presence of a sound source. This provides a way of translating information from other domains into sound, while the spatial position of the source conveys additional information from the data.
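One standard way to realize such a spatial illusion over two channels is equal-power amplitude panning. A minimal sketch (the azimuth convention here is an assumption):

```python
import numpy as np

def pan_stereo(mono, azimuth):
    """Equal-power stereo pan: azimuth -1.0 is hard left, +1.0 hard right.

    The left/right gains follow cos/sin of an angle in [0, pi/2], so the
    total power stays constant wherever the source is placed.
    """
    theta = (azimuth + 1.0) * np.pi / 4.0
    x = np.asarray(mono, dtype=float)
    return np.stack([np.cos(theta) * x, np.sin(theta) * x], axis=1)
```

Multichannel systems generalize this idea, e.g. with vector-base amplitude panning across pairs or triplets of loudspeakers.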

Taxonomy and Definitions for Sonification and Auditory Display

2008

Sonification is still a relatively young research field, and many terms such as sonification, auditory display, auralization, and audification have been used without a precise definition. Recent developments such as the introduction of Model-Based Sonification, the establishment of interactive sonification, and the increased interest in sonification from the arts have raised the need to revisit the definitions in order to move towards a clearer terminology. This paper introduces a new definition for sonification and auditory display that emphasizes the necessary and sufficient conditions for organized sound to be called sonification. It furthermore suggests a taxonomy and discusses the relation between visualization and sonification. A hierarchy of closed-loop interactions is also introduced. This paper aims to initiate a vivid discussion towards the establishment of a deeper theory of sonification and auditory display.