Retrieving musical information from neural data: how cognitive features enrich acoustic ones
Related papers
Music-induced emotions can be predicted from a combination of brain activity and acoustic features
Brain and Cognition, 2015
It is widely acknowledged that music can communicate and induce a wide range of emotions in the listener. However, music is a highly complex audio signal composed of many time- and frequency-varying components. Additionally, music-induced emotions are known to differ greatly between listeners. Therefore, it is not immediately clear what emotions a piece of music will induce in a given individual. We attempt to predict the music-induced emotional response in a listener by measuring the activity in the listener's electroencephalogram (EEG). We combine these measures with acoustic descriptors of the music, an approach that allows us to consider music as a complex set of time-varying acoustic features, independently of any specific music theory. Regression models are found which allow us to predict the music-induced emotions of our participants with a correlation between the actual and predicted responses of up to r = 0.234, p < 0.001. This regression fit suggests...
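The general approach described in this abstract, regressing self-reported emotion ratings onto combined EEG and acoustic features and scoring the fit by the correlation between actual and predicted responses, can be sketched as follows. This is a minimal illustration on synthetic data; the feature names, dimensions, and simulated ratings are assumptions for demonstration, not the paper's actual data or model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-trial features (illustrative only): EEG band-power
# measures and acoustic descriptors of the music being heard.
n_trials = 120
eeg_features = rng.normal(size=(n_trials, 8))       # e.g. band power per channel
acoustic_features = rng.normal(size=(n_trials, 4))  # e.g. energy, tempo, brightness

# Simulated self-reported emotion ratings, partially driven by both feature sets
X = np.hstack([eeg_features, acoustic_features])
true_w = rng.normal(size=X.shape[1])
ratings = X @ true_w + rng.normal(scale=5.0, size=n_trials)

# Fit ordinary least squares on a training split, predict held-out ratings
train, test = slice(0, 80), slice(80, None)
coef, *_ = np.linalg.lstsq(X[train], ratings[train], rcond=None)
pred = X[test] @ coef

# Score the model by the correlation between actual and predicted responses,
# analogous to the r value reported in the abstract
r = np.corrcoef(ratings[test], pred)[0, 1]
```

Any regression method could be substituted here; the key idea is that EEG and acoustic descriptors enter a single feature matrix, so the model can exploit both sources of variance.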
Revealing Preference in Popular Music Through Familiarity and Brain Response
IEEE Sensors Journal
Music preference has been reported as a factor that can elicit a listener's innermost musical emotions, which in turn supports accurate ground-truth data and efficient music therapy. This study performs statistical analysis to investigate how music preference is distinguished by familiarity scores, response times (response rates), and brain responses (EEG). Twenty participants completed self-assessments after listening to the chorus sections of two types of popular music: music without lyrics (Melody) and music with lyrics (Song). We then conducted music preference classification using a support vector machine (SVM) with the familiarity scores, response rates, and EEG as feature vectors. The statistical analysis and the SVM's F1-scores on EEG are congruent: the brain's right hemisphere outperformed the left in classification performance. Finally, these behavioral and brain findings support the idea that preference, familiarity, and response rate can inform the design of music-emotion experiments aimed at understanding music, emotion, and the listener. Beyond the music industry, the biomedical and healthcare industries can also exploit this experimental design to collect data from patients and improve the efficiency of healing by music.
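The classification pipeline named in this abstract, an SVM over familiarity, response-rate, and EEG features evaluated with an F1-score, can be sketched in a few lines of scikit-learn. Everything below runs on synthetic data; the feature layout and the simulated preference labels are assumptions for illustration only.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)

# Hypothetical per-trial feature vectors (illustrative, not the study's data):
# e.g. one familiarity score, one response rate, and a few EEG-derived values.
n = 200
X = rng.normal(size=(n, 6))

# Simulated binary preference label (like / dislike) loosely tied to the features
y = (X @ rng.normal(size=6) + rng.normal(scale=0.5, size=n) > 0).astype(int)

# Train an SVM classifier and score it with the F1 metric, as in the abstract
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = SVC(kernel="rbf").fit(X_tr, y_tr)
f1 = f1_score(y_te, clf.predict(X_te))
```

In the study's design, fitting one such classifier per feature group (or per hemisphere's EEG channels) and comparing the resulting F1-scores is what would support the reported left/right performance contrast.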
Psychomusicology: Music, Mind, and Brain, 2012
Recent neuroscience research has shown increasing use of multivariate decoding methods and machine learning. These methods, by uncovering the source and nature of informative variance in large data sets, invert the classical direction of inference that attempts to explain brain activity from mental state variables or stimulus features. However, these techniques are not yet commonly used among music researchers. In this position article, we introduce some key features of machine learning methods and review their use in the field of cognitive and behavioral neuroscience of music. We argue for the great potential of these methods in decoding multiple data types, specifically audio waveforms, electroencephalography, functional MRI, and motion capture data. By finding the most informative aspects of stimulus and performance data, hypotheses can be generated pertaining to how the brain processes incoming musical information and generates behavioral output, respectively. Importantly, these methods are also applicable to different neural and physiological data types such as magnetoencephalography, near-infrared spectroscopy, positron emission tomography, and electromyography.
Advances in Intelligent Systems and Computing, 2020
Semantic differential is often used to investigate the relationship between music and other sensory modalities such as colors, tastes, vision, and odors. This work proposes an exploratory approach including open-ended responses and subsequent machine learning to study cross-modal associations, based on a recently developed sensory scale that does not use any explicit verbal description. Twenty-five participants were asked to report a piece of music they considered close to the feel/look/experience of a given sensory stimulus. Results show that the associations reported by the participants can be explained, at least in part, by a set of features related to some timbral and tonal aspects of music.
Frontiers in Human Neuroscience, 2016
Emotion-related areas of the brain, such as the medial frontal cortices, amygdala, and striatum, are activated during listening to sad or happy music as well as during listening to pleasurable music. Indeed, in music, as in other arts, sad and happy emotions might co-exist and be distinct from emotions of pleasure or enjoyment. Here we aimed at discerning the neural correlates of sadness or happiness in music as opposed to those related to musical enjoyment. We further investigated whether musical expertise modulates neural activity during affective listening to music. To these aims, 13 musicians and 16 non-musicians brought to the lab their most liked and disliked musical pieces with a happy and sad connotation. Based on a listening test, we selected the most representative 18-second excerpts of the emotions of interest for each individual participant. Functional magnetic resonance imaging (fMRI) recordings were obtained while subjects listened to and rated the excerpts. The cortico-thalamo-striatal reward circuit and motor areas were more active during liked than disliked music, whereas only the auditory cortex and the right amygdala were more active for disliked than liked music. These results discern the brain structures responsible for the perception of sad and happy emotions in music from those related to musical enjoyment. We also obtained novel evidence for functional differences in the limbic system associated with musical expertise, by showing enhanced liking-related activity in fronto-insular and cingulate areas in musicians.
Neural and physiological data from participants listening to affective music
Scientific Data
Music provides a means of communicating affective meaning. However, the neurological mechanisms by which music induces affect are not fully understood. Our project sought to investigate this through a series of experiments into how humans react to affective musical stimuli and how physiological and neurological signals recorded from those participants change in accordance with self-reported changes in affect. In this paper, the datasets recorded over the course of this project are presented, including details of the musical stimuli, participant reports of their felt changes in affective states as they listened to the music, and concomitant recordings of physiological and neurological activity. We also include non-identifying metadata on our participant populations for purposes of further exploratory analysis. These data provide a large and valuable new resource for researchers investigating emotion, music, and how they affect our neural and physiological activity.
Music impinges upon the body and the brain. As such, it has significant inductive power which relies both on innate dispositions and acquired mechanisms and competencies. The processes are partly autonomous and partly deliberate, and interrelations between several levels of processing are becoming clearer with accumulating new evidence. For instance, recent developments in neuroimaging techniques have broadened the field by encompassing the study of cortical and subcortical processing of music. The domain of musical emotions is a typical example, with a major focus on the pleasure that can be derived from listening to music. Pleasure, however, is not the only emotion to be induced, and the mechanisms behind its elicitation are far from understood. There are also mechanisms related to arousal and activation that are both less differentiated and at the same time more complex than the assumed mechanisms that trigger basic emotions. It is imperative, therefore, to investigate what pleasurable and mood-modifying effects music can have on human beings in real-time listening situations. This e-book is an attempt to answer these questions. Revolving around the specificity of music experience in terms of perception, emotional reactions, and aesthetic assessment, it presents new hypotheses and theoretical claims as well as new empirical data which contribute to a better understanding of the functions of the brain as related to musical experience.
Predictions and the brain: how musical sounds become rewarding
Trends in Cognitive Sciences, 2015
Music has always played a central role in human culture. The question of how musical sounds can have such profound emotional and rewarding effects has been a topic of interest throughout generations. At a fundamental level, listening to music involves tracking a series of sound events over time. Because humans are experts in pattern recognition, temporal predictions are constantly generated, creating a sense of anticipation. We summarize how complex cognitive abilities and cortical processes integrate with fundamental subcortical reward and motivation systems in the brain to give rise to musical pleasure. This work builds on previous theoretical models that emphasize the role of prediction in music appreciation by integrating these ideas with recent neuroscientific evidence.