A review of real-time EEG sonification research

Prototyping a Method for the Assessment of Real-Time EEG Sonifications

This paper presents a first step in the development of a methodology to compare the ability of different sonifications to convey the fine temporal detail of the electroencephalography (EEG) brainwave signal in real time. In EEG neurofeedback a person's EEG activity is monitored and presented back to them, to help them learn how to modify their brain activity. Learning theory suggests that the more rapidly and accurately the feedback follows behaviour, the more efficient the learning will be. Therefore, a critical issue is how to assess the ability of a sonification to convey rapid and temporally complex EEG data for neurofeedback. To allow for replication, this study used sonifications of pre-recorded EEG data and asked participants to try to track aspects of the signal in real time using a mouse. This study showed that, although imperfect, this approach is a practical way to compare the suitability of EEG sonifications for tracking detailed EEG signals in real time and tha...
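
As a rough illustration of how such a mouse-tracking task might be scored, the sketch below correlates the participant's mouse trace with the band-power envelope of the pre-recorded EEG at the best-fitting lag. The scoring method, the lag search and all parameter values are assumptions for illustration and are not taken from the paper.

```python
import numpy as np

def tracking_score(eeg_envelope, mouse_trace, fs, max_lag_s=1.0):
    """Correlate a participant's mouse trace with the sonified EEG envelope.

    Returns the best Pearson correlation over lags up to max_lag_s, plus the
    lag (in seconds) at which it occurs.  Both inputs are assumed to be
    sampled at the same rate fs (Hz) and to have equal length.
    """
    x = (eeg_envelope - eeg_envelope.mean()) / eeg_envelope.std()
    y = (mouse_trace - mouse_trace.mean()) / mouse_trace.std()
    max_lag = int(max_lag_s * fs)
    best_r, best_lag = -np.inf, 0
    for lag in range(max_lag + 1):          # the mouse is assumed to trail the sound
        r = float(np.corrcoef(x[:x.size - lag], y[lag:])[0, 1])
        if r > best_r:
            best_r, best_lag = r, lag
    return best_r, best_lag / fs

# Example: score one 30 s trial sampled at 60 Hz
fs = 60
t = np.arange(0, 30, 1 / fs)
envelope = np.abs(np.sin(2 * np.pi * 0.2 * t))            # toy alpha-power envelope
mouse = np.roll(envelope, int(0.4 * fs)) + 0.1 * np.random.randn(t.size)
r, lag = tracking_score(envelope, mouse, fs)
print(f"best correlation {r:.2f} at a lag of {lag:.2f} s")
```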

Multi-channel sonification of human EEG

2007

The electroencephalogram (EEG) provides a diagnostically important stream of multivariate data about the activity of the human brain. Various EEG sonification strategies have been proposed, but auditory space has rarely been used to give cues about the location of specific events. Here we introduce a multivariate event-based sonification that, in addition to displaying salient rhythms, uses pitch and spatial location...
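
The abstract gives no implementation detail, but the general idea of an event-based display that places events in auditory space according to scalp location and codes their magnitude as pitch can be sketched roughly as follows. The channel azimuths, the pitch range and the constant-power panning law are illustrative assumptions.

```python
import numpy as np

SR = 44100  # audio sample rate (Hz)

# Assumed left-right azimuths (-1 = hard left, +1 = hard right) for a few 10-20 sites
CHANNEL_PAN = {"C3": -0.6, "Cz": 0.0, "C4": 0.6, "O1": -0.3, "O2": 0.3}

def render_event(channel, magnitude, duration=0.15):
    """Render one detected EEG event as a short stereo tone.

    Pitch rises with event magnitude (normalised to 0..1); the tone is placed
    in the stereo field according to the channel's scalp position using
    constant-power panning.
    """
    freq = 220.0 * 2 ** (2.0 * np.clip(magnitude, 0.0, 1.0))   # 220-880 Hz
    t = np.arange(int(duration * SR)) / SR
    env = np.hanning(t.size)                                   # click-free envelope
    tone = env * np.sin(2 * np.pi * freq * t)
    pan = CHANNEL_PAN.get(channel, 0.0)
    theta = (pan + 1) * np.pi / 4                              # 0..pi/2
    return np.stack([np.cos(theta) * tone, np.sin(theta) * tone], axis=1)

# Example: an alpha burst detected on O2 with moderate amplitude
stereo = render_event("O2", magnitude=0.5)
print(stereo.shape)   # (samples, 2), ready to be written to an audio buffer
```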

Sonifications for EEG Data Analysis

International Conference on Auditory Display, 2000

This paper presents techniques to render acoustic representations for EEG data. In our case, data are obtained from psycholinguistic experiments where subjects are exposed to three different conditions based on different auditory stimuli. The goal of this research is to uncover elements of neural processing correlated with high-level cognitive activity. Three sonifications are presented within this...

New Sonification Tools for EEG Data Screening and Monitoring

This paper describes two software implementations for EEG data screening and real-time monitoring by means of sonification. Both have been designed in close collaboration with our partner institutions. Both tools were tested in depth with volunteers, and then with the expert users they are intended for, i.e. neurologists working with EEG data. In the course of these tests, a number of improvements to the designs were realised; the tests and the final versions of the tools are described in detail. The paper is intended to provide an integrated description and analysis of all aspects of the design process, from sonification design issues to interaction choices to user acceptance.

Event-based sonification of EEG rhythms in real time

Clinical Neurophysiology, 2007

Objective: To introduce a sound synthesis tool for human EEG rhythms that is applicable in real time. Methods: We design an event-based sonification which suppresses irregular background and highlights normal and pathologic rhythmic activity. Results: We generated sound examples with rhythms from well-known epileptic disorders and found stereotyped rhythmic auditory objects in single-channel and stereo display from generalized spike-wave runs. For interictal activity, we were able to separate focal rhythms from background activity and thus enable the listener to perceive their frequency, duration, and intensity while monitoring. Conclusions: The proposed event-based sonification allows quick detection and identification of different types of rhythmic EEG events in real time and can thus be used to complement visual displays in monitoring and EEG feedback tasks. Significance: The significance of the work lies in the fact that it can be implemented for on-line monitoring of clinical EEG and for EEG feedback applications where continuous screen watching can be substituted or improved by the auditory information stream.
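
A crude, offline stand-in for this kind of event detection might look like the sketch below: band-pass the signal, take its analytic envelope, and flag stretches where the envelope clearly exceeds a background estimate, so that only rhythmic activity triggers sound. The band edges, the median-based background estimate and the threshold factor are assumptions; the paper's real-time detector is not reproduced here.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

def rhythmic_events(eeg, fs, band=(3.0, 4.0), k=3.0):
    """Flag samples where band-limited power clearly exceeds the background.

    Band-pass the signal, take its analytic envelope, and mark stretches where
    the envelope exceeds k times the median envelope (the background estimate).
    Each flagged stretch would then trigger an auditory event rather than a
    continuous tone, so irregular background activity stays silent.
    """
    sos = butter(4, band, btype="bandpass", fs=fs, output="sos")
    narrow = sosfiltfilt(sos, eeg)          # zero-phase filtering, offline only
    envelope = np.abs(hilbert(narrow))
    threshold = k * np.median(envelope)
    return envelope > threshold             # boolean mask of rhythmic activity

# Example: a 3 Hz spike-wave-like burst embedded in noise
fs = 256
t = np.arange(0, 10, 1 / fs)
eeg = 10e-6 * np.random.randn(t.size)
burst = (t > 4) & (t < 6)
eeg[burst] += 50e-6 * np.sin(2 * np.pi * 3 * t[burst])
mask = rhythmic_events(eeg, fs)
print(f"rhythmic activity detected in {mask.mean():.0%} of samples")
```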

Sonification playback rates during matching tasks of visualised and sonified EEG data

In this paper, the authors discuss a user study examining the role of sonification in electroencephalography (EEG) data presentation. Conventionally, EEG data are presented using visualisation techniques and incorporate multivariate, time-critical information. As the number of EEG channels increases, or when screen real estate is reduced, visually-presented data can become cluttered and occluded. Our user study examined how accurately users could match visualised EEG data to sonic equivalents, and at what playback rate this was most effective. Accuracy and timing data were recorded, as well as task load index (TLX) questionnaires. Results show that faster playback rates of sonified EEG data yield more accurate results. However, matching accuracy of sonified EEG data in the form presented in this study was not sufficient to replace visualised EEG. Although sonified electroencephalograms are not presently a complete replacement, sonification has the potential to effectively represent aspects of EEG data when visualisation alone becomes challenging for the user. The authors therefore propose a multimodal approach to EEG data presentation aimed at reducing visual clutter and reducing the cognitive load experienced by users when presented with too many dynamic variables on screen.
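
The abstract does not describe the audio renderer, but varying the playback rate of an already-sonified clip can be approximated by plain resampling, as in the sketch below. This shifts pitch along with speed; a pitch-preserving time-stretch could equally be substituted, and the study's actual rendering method is not specified.

```python
from fractions import Fraction

import numpy as np
from scipy.signal import resample_poly

def change_playback_rate(audio, rate):
    """Speed a sonified-EEG audio clip up (rate > 1) or slow it down (rate < 1).

    Plain resampling is used, so pitch shifts along with speed; a
    pitch-preserving time-stretch could be substituted instead.
    """
    frac = Fraction(rate).limit_denominator(100)   # approximate rate as a small ratio
    return resample_poly(audio, up=frac.denominator, down=frac.numerator)

# Example: play a 10 s clip at double speed, giving 5 s of audio
sr = 22050
clip = np.sin(2 * np.pi * 440 * np.arange(0, 10, 1 / sr))
fast = change_playback_rate(clip, 2.0)
print(len(clip) / sr, "s ->", len(fast) / sr, "s")
```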

Real-Time Electroencephalogram Sonification for Neurofeedback

2018

Electroencephalography (EEG) is the measurement, via the scalp, of the electrical activity of the brain. The established therapeutic intervention of neurofeedback involves presenting people with their own EEG in real time to enable them to modify their EEG for purposes of improving performance or health. The aim of this research is to develop and validate real-time sonifications of EEG for use in neurofeedback, and methods for assessing such sonifications. Neurofeedback generally uses a visual display. Where auditory feedback is used, it is mostly limited to pre-recorded sounds triggered by the EEG activity crossing a threshold. However, EEG generates time-series data with meaningful detail at fine temporal resolution and with complex temporal dynamics. Human hearing has a much higher temporal resolution than human vision, and auditory displays do not require people to focus on a screen with their eyes open for extended periods of time – e.g. if they are engaged in some other task. Son...
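
A minimal sketch of the contrast drawn here, between threshold-triggered feedback and a continuous mapping of the band-power time series onto sound, might look as follows. All parameter choices, function names and the pitch range are illustrative assumptions.

```python
import numpy as np

def threshold_feedback(alpha_power, threshold):
    """Conventional auditory neurofeedback: a reward sound fires only when the
    monitored band power crosses a fixed threshold."""
    return alpha_power > threshold            # True -> trigger pre-recorded sound

def continuous_sonification(alpha_power, f_lo=200.0, f_hi=800.0):
    """Continuous mapping: every sample of the band-power time series is
    mapped onto an oscillator frequency, preserving fine temporal detail."""
    p = np.clip(alpha_power / alpha_power.max(), 0.0, 1.0)
    return f_lo * (f_hi / f_lo) ** p          # exponential pitch mapping (Hz)

# Example with a toy alpha-power trace sampled at 10 Hz
power = np.abs(np.sin(np.linspace(0, 6 * np.pi, 120))) + 0.1
print(int(threshold_feedback(power, 0.8).sum()), "samples above threshold (would trigger the reward sound)")
print(continuous_sonification(power)[:5].round(1), "Hz")
```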

The Role of Personalization and Multiple EEG and Sound Features Selection in Real Time Sonification for Neurofeedback

The field of physiology-based interaction and monitoring is developing at a fast pace. Emerging applications like fatigue monitoring often use sound to convey the complex dynamics of biological signals and to provide an alternative, non-visual information channel. Most physiology-to-sound mappings in such auditory displays do not allow customization by the end users. We designed a new sonification system that can be used for extracting, processing and displaying electroencephalography (EEG) data with different sonification strategies. The system was validated with four user groups performing alpha/theta neurofeedback training (a/t) for relaxation, varying in feedback personalization (Personalized/Fixed) and the number of sonified EEG features (Single/Multiple). The groups with personalized feedback performed significantly better in their training than the fixed-mapping groups, as shown by both subjective ratings and physiological indices. Additionally, sonifying a higher number of EEG features resulted in deeper relaxation than training with single-feature feedback. Our results demonstrate the importance of adapting and personalizing EEG sonification to the particular application, in our case a/t neurofeedback. Our experimental approach shows how user performance can be used for validating different sonification strategies.
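
How such a system might compute EEG features and route them through a user-chosen mapping can be sketched as follows. The band edges, the feature set and the sound-parameter names are assumptions for illustration, not taken from the paper.

```python
import numpy as np
from scipy.signal import welch

# Assumed band edges (Hz); in practice these could be individualised as well
BANDS = {"theta": (4.0, 8.0), "alpha": (8.0, 12.0)}

def band_powers(eeg, fs):
    """Estimate theta and alpha power from a short EEG window via Welch's method."""
    f, psd = welch(eeg, fs=fs, nperseg=min(len(eeg), 2 * fs))
    return {name: psd[(f >= lo) & (f < hi)].mean() for name, (lo, hi) in BANDS.items()}

# Two illustrative mappings: a fixed one and a "personalised" one chosen by the user
FIXED_MAPPING = {"theta": "pad_volume", "alpha": "pad_brightness"}
PERSONALISED_MAPPING = {"theta": "water_sound_volume", "alpha": "bird_sound_density"}

def sonification_parameters(eeg, fs, mapping):
    """Turn band powers into named sound-engine parameters, normalised to 0..1."""
    powers = band_powers(eeg, fs)
    total = sum(powers.values())
    return {mapping[name]: p / total for name, p in powers.items()}

# Example: one 2 s window of toy EEG
fs = 256
eeg = np.random.randn(2 * fs)
print(sonification_parameters(eeg, fs, PERSONALISED_MAPPING))
```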

A Neural Network Based EEG Temporal Pattern Sonification

This paper presents a technique to provide an acoustic representation of electroencephalogram (EEG) data using neural networks. The sample EEG was recorded during random left- and right-hand movements, with eyes closed, from a 21-year-old, right-handed male with no known medical conditions. In addition to the recorded data, an EEG signal simulator was used to generate random EEG signals. Data pre-processing was performed using short-time Fourier transform (STFT) and singular value decomposition (SVD) techniques. A neural network (NN)-based system was then used to sonify the EEG data into sound in the C5-B5 octave.
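
A rough sketch of the described pipeline is given below, with an untrained stand-in for the neural network: the paper's trained NN, its architecture and its exact note mapping are not reproduced, and the output is simply one of the 12 semitones C5 through B5 per time frame.

```python
import numpy as np
from scipy.signal import stft

NOTE_NAMES = ["C5", "C#5", "D5", "D#5", "E5", "F5",
              "F#5", "G5", "G#5", "A5", "A#5", "B5"]

def eeg_to_notes(eeg, fs, n_components=8, seed=0):
    """STFT -> SVD feature reduction -> tiny feedforward net -> C5-B5 notes.

    The network is untrained (random weights) purely to show the shape of the
    pipeline described in the abstract; the paper's NN and its training on
    actual and simulated EEG are not reproduced here.
    """
    # 1. Short-time Fourier transform of the EEG signal
    _, _, Z = stft(eeg, fs=fs, nperseg=fs)            # (freq bins, time frames)
    spec = np.abs(Z).T                                # one row per time frame

    # 2. SVD-based reduction of the magnitude spectrogram
    U, s, _ = np.linalg.svd(spec, full_matrices=False)
    features = U[:, :n_components] * s[:n_components]

    # 3. One hidden layer mapping each frame's features to one of 12 notes
    rng = np.random.default_rng(seed)
    W1 = rng.standard_normal((n_components, 16))
    W2 = rng.standard_normal((16, 12))
    note_idx = np.argmax(np.tanh(features @ W1) @ W2, axis=1)
    return [NOTE_NAMES[i] for i in note_idx]

# Example: 4 s of toy EEG sampled at 128 Hz
fs = 128
eeg = np.random.randn(4 * fs)
print(eeg_to_notes(eeg, fs)[:8])
```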