Music Lexical Networks
Related papers
The neural substrates of musical memory revealed by fMRI and two semantic tasks
NeuroImage, 2010
Recognizing a musical excerpt without necessarily retrieving its title typically reflects the existence of a memory system dedicated to the retrieval of musical knowledge. The functional distinction between musical and verbal semantic memory has seldom been investigated. In this fMRI study, we directly compared the musical and verbal memory of 20 nonmusicians, using a congruence task involving automatic semantic retrieval and a familiarity task requiring more thorough semantic retrieval. In the former, participants had to access their semantic store to retrieve musical or verbal representations of the melodies or expressions they heard, in order to decide whether or not these were then given the correct ending. In the latter, they had to judge the level of familiarity of musical excerpts and expressions. Both tasks revealed activation of the left inferior frontal and posterior middle temporal cortices, suggesting that executive and selection processes are common to verbal and musical retrieval. Distinct patterns of activation were observed within the left temporal cortex, with musical material mainly activating the superior temporal gyrus and verbal material the middle and inferior temporal gyri. This cortical organization of musical and verbal semantic representations could explain clinical dissociations featuring selective disturbances of musical or verbal material.
Neuronal Correlates of Perception, Imagery, and Memory for Familiar Tunes
Journal of Cognitive Neuroscience, 2012
We used fMRI to investigate the neuronal correlates of encoding and recognizing heard and imagined melodies. Ten participants were shown lyrics of familiar verbal tunes; they either heard the tune along with the lyrics, or they had to imagine it. In a subsequent surprise recognition test, they had to identify the titles of tunes that they had heard or imagined earlier. The functional data showed substantial overlap during melody perception and imagery, including secondary auditory areas. During imagery compared with perception, an extended network including pFC, SMA, intraparietal sulcus, and cerebellum showed increased activity, in line with the increased processing demands of imagery. Functional connectivity of anterior right temporal cortex with frontal areas was increased during imagery compared with perception, indicating that these areas form an imagery-related network. Activity in right superior temporal gyrus and pFC was correlated with the subjective rating of imagery vividness. Similar to the encoding phase, the recognition task recruited overlapping areas, including inferior frontal cortex associated with memory retrieval, as well as left middle temporal gyrus. The results present new evidence for the cortical network underlying goal-directed auditory imagery, with a prominent role of the right pFC both for the subjective impression of imagery vividness and for on-line mental monitoring of imagery-related activity in auditory areas.
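For readers less familiar with the connectivity measure reported above, the sketch below shows seed-based functional connectivity in its simplest form: correlating a seed region's BOLD time course with every target voxel's time course. The array shapes, region labels, and random data are purely illustrative assumptions; this is not the analysis pipeline used in the study.

```python
# Minimal sketch of seed-based functional connectivity (Pearson correlation
# between a seed region's time course and each target voxel's time course).
# All data and region names here are hypothetical, for illustration only.
import numpy as np

def seed_connectivity(seed_ts: np.ndarray, voxel_ts: np.ndarray) -> np.ndarray:
    """seed_ts: (n_timepoints,) BOLD time course of the seed region.
    voxel_ts: (n_voxels, n_timepoints) time courses of target voxels.
    Returns one correlation value per voxel."""
    seed = (seed_ts - seed_ts.mean()) / seed_ts.std()
    vox = (voxel_ts - voxel_ts.mean(axis=1, keepdims=True)) / voxel_ts.std(axis=1, keepdims=True)
    return vox @ seed / seed.size

# Toy usage: one connectivity map per condition; comparing maps between
# conditions (e.g. imagery vs. perception) is one common way an increase
# in connectivity would be expressed.
rng = np.random.default_rng(0)
seed_imagery = rng.standard_normal(200)             # hypothetical temporal-lobe seed
frontal_imagery = rng.standard_normal((500, 200))   # hypothetical frontal voxels
r_imagery = seed_connectivity(seed_imagery, frontal_imagery)
print(r_imagery.shape)  # (500,) -> one r value per voxel
```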
Neural Correlates Underlying Musical Semantic Memory
Annals of the New York Academy of Sciences, 2009
Numerous functional imaging studies have examined the neural basis of semantic memory, mainly using verbal and visuospatial materials. Musical material also offers an original way to explore semantic memory processes. We used PET imaging to determine the neural substrates that underlie musical semantic memory, using different tasks and stimuli. The results of three PET studies revealed a greater involvement of the anterior part of the temporal lobe. Considering both clinical observations and our neuroimaging data, the musical lexicon (and, more broadly, musical semantic memory) appears to be sustained by a temporo-prefrontal cerebral network involving both right and left cerebral regions.
Personal familiarity of music and its cerebral effect on subsequent speech processing
Scientific Reports
Despite the obvious personal relevance of some musical pieces, the cerebral mechanisms associated with listening to personally familiar music and its effects on subsequent brain functioning have not been specifically evaluated yet. We measured cerebral correlates with functional magnetic resonance imaging (fMRI) while composers listened to three types of musical excerpts varying in personal familiarity and self (familiar own/composition, familiar other/favorite or unfamiliar other/unknown music) followed by sequences of names of individuals also varying in personal familiarity and self (familiar own/own name, familiar other/close friend and unfamiliar other/unknown name). Listening to music with autobiographical contents (familiar own and/or other) recruited a fronto-parietal network including mainly the dorsolateral prefrontal cortex, the supramarginal/angular gyri and the precuneus. Additionally, while listening to familiar other music (favorite) was associated with the activation...
Familiarity in music has been reported as an important factor modulating emotional and hedonic responses in the brain. Familiarity and repetition may increase the liking of a piece of music, thus inducing positive emotions. Neuroimaging studies have focused on identifying the brain regions involved in the processing of familiar and unfamiliar musical stimuli. However, the use of different modalities and experimental designs has led to discrepant results, and it is not clear which areas of the brain are most reliably engaged when listening to familiar and unfamiliar musical excerpts. In the present study, we conducted a systematic review of three databases (Medline, PsycINFO, and Embase) using the keywords (recognition OR familiar OR familiarity OR exposure effect OR repetition) AND (music OR song) AND (brain OR brains OR neuroimaging OR functional Magnetic Resonance Imaging OR Positron Emission Tomography OR Electroencephalography OR Event Related Potential OR Magnetoencephalography). Of the 704 titles identified, 23 neuroimaging studies met our inclusion criteria for the systematic review. After removing studies providing insufficient information or contrasts, 11 studies (involving 212 participants) qualified for the meta-analysis using the activation likelihood estimation (ALE) approach. Our results did not find significant peak activations consistently across the included studies. Using a less conservative approach (p < 0.001, uncorrected for multiple comparisons), we found that the left superior frontal gyrus, the ventral lateral (VL) nucleus of the left thalamus, and the left medial surface of the superior frontal gyrus had the highest likelihood of being activated by familiar music. On the other hand, the left insula and the right anterior cingulate cortex had the highest likelihood of being activated by unfamiliar music. We had expected limbic structures to emerge as the top clusters when listening to familiar music; instead, music familiarity showed a motor pattern of activation. This could reflect an audio-motor synchronization to the rhythm which is more...
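For context on the ALE approach mentioned above, here is a minimal, self-contained sketch of the core computation: each study's reported foci are blurred into a "modeled activation" map, and the maps are combined across studies as a probabilistic union. The grid size, smoothing width, and coordinates are invented for illustration; real ALE analyses (e.g. with GingerALE or NiMARE) use empirically derived kernels and permutation-based thresholding, which are omitted here.

```python
# Minimal sketch of the core ALE (activation likelihood estimation) step.
# Coordinates and parameters are made up; thresholding is omitted.
import numpy as np

GRID = np.stack(np.meshgrid(np.arange(20), np.arange(20), np.arange(20),
                            indexing="ij"), axis=-1)  # toy 20x20x20 voxel grid

def ma_map(foci: np.ndarray, sigma: float = 2.0) -> np.ndarray:
    """Modeled activation map for one study: max Gaussian over its foci."""
    maps = []
    for focus in foci:
        d2 = ((GRID - focus) ** 2).sum(axis=-1)
        maps.append(np.exp(-d2 / (2 * sigma ** 2)))
    return np.max(maps, axis=0)

# Two hypothetical studies, each contributing a few activation foci (voxel coords).
studies = [np.array([[5, 5, 5], [6, 7, 5]]),
           np.array([[5, 6, 5], [15, 15, 10]])]

ma = np.stack([ma_map(f) for f in studies])   # (n_studies, x, y, z)
ale = 1.0 - np.prod(1.0 - ma, axis=0)         # probabilistic union across studies
peak = np.unravel_index(np.argmax(ale), ale.shape)
print("peak ALE voxel:", peak, "value:", round(float(ale[peak]), 3))
```

Voxels where foci from several studies overlap accumulate a higher ALE value, which is why convergence across studies, rather than any single study's peaks, drives the result.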
Introduction to The Neurosciences and Music IV: Learning and Memory
Annals of the New York Academy of Sciences, 2012
Two opening workshops, three large and vibrant poster sessions, and nine invited symposia introduced a diverse range of recent research findings and discussed current research directions. Here, the proceedings are introduced by the workshop and symposia leaders on topics including working with children, rhythm perception, language processing, cultural learning, memory, musical imagery, neural plasticity, stroke rehabilitation, autism, and amusia. The rich diversity of the interdisciplinary research presented suggests that the future of music neuroscience looks both exciting and promising, and that important implications for music rehabilitation and therapy are being discovered.
Superior Formation of Cortical Memory Traces for Melodic Patterns in Musicians
Learning & Memory, 2001
The human central auditory system has a remarkable ability to establish memory traces for invariant features in the acoustic environment despite continual acoustic variations in the sounds heard. By recording the memory-related mismatch negativity (MMN) component of the auditory electric and magnetic brain responses, as well as behavioral performance, we investigated how subjects learn to discriminate changes in a melodic pattern presented at several frequency levels. In addition, we explored whether musical expertise facilitates this learning. Our data show that musicians who perform music primarily without a score, in particular, learn easily to detect contour changes in a melodic pattern presented at variable frequency levels. After learning, their auditory cortex detects these changes even when their attention is directed away from the sounds. The present results thus show that, after perceptual learning during attentive listening has taken place, changes in a highly complex auditory pattern can be detected automatically by the human auditory cortex and, further, that this process is facilitated by musical expertise.
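As a rough illustration of how the MMN reported above is conventionally quantified, the sketch below averages simulated EEG epochs for standard and deviant (contour-changed) patterns, subtracts the two averages, and measures the resulting difference wave in a post-stimulus window. The sampling rate, trial counts, and 100-250 ms window are assumptions for illustration, not the parameters of the study.

```python
# Minimal sketch of an MMN difference wave: mean(deviant epochs) - mean(standard
# epochs), summarized in a post-stimulus window. All data here are simulated.
import numpy as np

SFREQ = 250                                   # samples per second (assumed)
times = np.arange(-0.1, 0.5, 1 / SFREQ)       # -100 ms to +500 ms around onset

rng = np.random.default_rng(1)
standard_epochs = rng.standard_normal((300, times.size))   # 300 standard trials
deviant_epochs = rng.standard_normal((60, times.size))     # 60 deviant trials

# Inject a small negativity around 100-250 ms on deviant trials (toy effect).
mmn_window = (times >= 0.10) & (times <= 0.25)
deviant_epochs[:, mmn_window] -= 1.0

difference_wave = deviant_epochs.mean(axis=0) - standard_epochs.mean(axis=0)
mmn_amplitude = difference_wave[mmn_window].mean()
print(f"mean MMN amplitude in 100-250 ms window: {mmn_amplitude:.2f} (a.u.)")
```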