Effects of aging and musical experience on the representation of tonal hierarchies

Changes in the Perception and the Psychological Structure of Musical Emotions with Advancing Age

Experimental Aging Research, 2012

This study was designed to test whether there are age-related changes in emotional judgments and in the psychological structure of musical emotions. Twenty-five older and 25 younger listeners performed emotional judgment and free categorization tasks on happy, peaceful, sad, and threatening musical excerpts. Compared to younger adults, older adults did not discriminate the arousal difference between peaceful and threatening excerpts and showed a higher association between arousal and valence judgments. A multidimensional scaling analysis indicated that the emotional space shown by older listeners did not fit younger listeners' bidimensional valence-arousal structure. Happy excerpts were also better categorized by the older group. Altogether, these data are consistent with the view that advancing age may result in a reduction of emotional complexity and a distortion of emotional processing in a positive direction.

The Perception of Tonal Structure Through the Differentiation and Organization of Pitches.

Journal of Experimental Psychology: …, 2004

The roles of two psychological processes, differentiation and organization, were examined in the perception of musical tonality. Differentiation distinguishes elements from one another and was varied in terms of the distribution of pitch durations within tone sequences. Organization establishes relations between differentiated elements and was varied in terms of either conformity with or deviation from a hierarchical description of tonality. Multiple experiments demonstrated that the perception of tonality depended on a minimal degree of differentiation in the distribution of the duration (but not the frequency of occurrence) of pitches, and only when pitch distributions were hierarchically organized. Moreover, the mere differentiation of the tonic from nontonic pitches was not sufficient to induce tonal percepts. These results are discussed in relation to tonal strength, musical expressiveness, and principles of auditory pattern processing.

Probing the Minor Tonal Hierarchy

Music Perception: An Interdisciplinary Journal, 2011

Two experiments investigated psychological representations of musical tonality in auditory imagery. In Experiment 1, musically trained participants heard a single tone as a perceptual cue and built an auditory image of a specified major tonality based on that cue; participants' images were then assessed using judgments of probe tones. In Experiment 2, participants imaged a minor tonality rather than a major one. Analysis of the probe tone ratings indicated that participants successfully imaged both major and minor tonal hierarchies, demonstrating that auditory imagery functions comparably to auditory perception. In addition, the strength of the major tonal image was dependent upon the pitch and tonal relations of the perceptual cue and the to-be-imaged tonality. Finally, representations of minor tonal hierarchies were less robust than those of major ones, converging with perceptual evidence that minor tonalities are less psychologically stable than major tonalities.
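Probe-tone data of this kind are conventionally analyzed by correlating a listener's 12 probe-tone ratings against the standard Krumhansl & Kessler (1982) key profiles. A minimal sketch of that comparison, using those published profile values (the helper names are illustrative, and this is a simplified stand-in for the full analysis, not the authors' exact procedure):

```python
import math

# Krumhansl & Kessler (1982) probe-tone profiles, tonic first.
MAJOR = [6.35, 2.23, 3.48, 2.33, 4.38, 4.09, 2.52, 5.19, 2.39, 3.66, 2.29, 2.88]
MINOR = [6.33, 2.68, 3.52, 5.38, 2.60, 3.53, 2.54, 4.75, 3.98, 2.69, 3.34, 3.17]

def pearson(x, y):
    """Pearson correlation between two equal-length rating vectors."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def best_fit_mode(ratings):
    """Return ('major' or 'minor', r): whichever reference profile
    correlates more strongly with the 12 probe-tone ratings."""
    r_maj = pearson(ratings, MAJOR)
    r_min = pearson(ratings, MINOR)
    return ("major", r_maj) if r_maj >= r_min else ("minor", r_min)
```

Feeding a profile back into `best_fit_mode` recovers its own mode with r = 1; a weaker correlation for imaged minor ratings than for imaged major ratings would mirror the "less robust" minor hierarchy reported above.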

The perception of tonal intervals in isolation and in melodic context

Psychomusicology: A Journal of Research in Music Cognition, 1982

Twenty-six musicians judged 13 successive ascending tonal intervals on labeling and discrimination tasks. Stimuli ranged from 480 to 720 cents, in 20-cent increments. In the labeling test, musicians identified both the interval category and the intonation quality (flat, sharp, or in tune) of intervals. In the discrimination tests, musicians determined whether the second of two intervals heard on each trial was larger, smaller, or the same as the first. Intervals were heard in isolation and as the last two notes of a ten-note tonal melodic fragment during both tasks. Results showed musicians to be more accurate on both tasks when intervals were presented in melodic context than when they were presented in isolation. In addition, although musicians responded categorically to intervals presented both in isolation and in melodic context, they categorized to a lesser extent when intervals were presented in context. Results from the labeling task demonstrated that musicians discriminated sharp from flat intonation quality for both types of interval presentation. These results suggest that musicians may not categorize pitch structures in musical situations to the extent that they have been shown to do so in laboratory settings.
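The 480-720 cent stimulus range maps onto frequency ratios via the standard cents formula, ratio = 2^(cents/1200), and spans the equal-tempered perfect fourth (500), tritone (600), and perfect fifth (700). A small sketch of the labeling task's structure (the category set and the +/-10-cent "in tune" band are illustrative assumptions, not parameters from the study):

```python
import math

# Equal-tempered landmarks inside the 480-720 cent stimulus range.
CATEGORIES = {"perfect fourth": 500, "tritone": 600, "perfect fifth": 700}

def cents_to_ratio(cents):
    """Frequency ratio of an interval given its size in cents."""
    return 2 ** (cents / 1200)

def label(cents, tolerance=10):
    """Nearest equal-tempered category plus an intonation tag.
    The +/-10-cent 'in tune' band is illustrative, not from the study."""
    name, ref = min(CATEGORIES.items(), key=lambda kv: abs(cents - kv[1]))
    if abs(cents - ref) <= tolerance:
        quality = "in tune"
    elif cents < ref:
        quality = "flat"
    else:
        quality = "sharp"
    return name, quality
```

For example, a 480-cent stimulus would be labeled a flat perfect fourth, and 700 cents an in-tune perfect fifth (ratio about 1.498, near the just 3:2).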

Losing the Music: Aging Affects the Perception and Subcortical Neural Representation of Musical Harmony

When two musical notes with simple frequency ratios are played simultaneously, the resulting musical chord is pleasing and evokes a sense of resolution or “consonance”. Complex frequency ratios, on the other hand, evoke feelings of tension or “dissonance”. Consonance and dissonance form the basis of harmony, a central component of Western music. In earlier work, we provided evidence that consonance perception is based on neural temporal coding in the brainstem (Bones et al., 2014). Here, we show that for listeners with clinically normal hearing, aging is associated with a decline in both the perceptual distinction and the distinctiveness of the neural representations of different categories of two-note chords. Compared with younger listeners, older listeners rated consonant chords as less pleasant and dissonant chords as more pleasant. Older listeners also had less distinct neural representations of consonant and dissonant chords as measured using a Neural Consonance Index derived from the electrophysiological “frequency-following response.” The results withstood a control for the effect of age on general affect, suggesting that different mechanisms are responsible for the perceived pleasantness of musical chords and affective voices and that, for listeners with clinically normal hearing, age-related differences in consonance perception are likely to be related to differences in neural temporal coding.
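One crude proxy for the ratio-simplicity idea described above is to score a dyad by the complexity of its just-intonation frequency ratio. A minimal sketch (this heuristic is an illustration of the consonance/dissonance contrast, not the study's Neural Consonance Index):

```python
from fractions import Fraction

# Just-intonation ratios for a consonant and a dissonant dyad.
PERFECT_FIFTH = Fraction(3, 2)    # simple ratio  -> consonant
MINOR_SECOND = Fraction(16, 15)   # complex ratio -> dissonant

def ratio_complexity(r: Fraction) -> int:
    """Heuristic score: smaller numerator + denominator -> more consonant."""
    return r.numerator + r.denominator
```

Under this score the perfect fifth (3 + 2 = 5) comes out far simpler than the minor second (16 + 15 = 31), matching the consonant/dissonant contrast the study probes neurally.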

The Development of Musical Preference across the Life Span

This paper focuses on the relationship between the cognitive and affective aspects of listening to various music styles. The document describes two cross-sectional studies of the same sample of 275 subjects drawn from 5 age groups (9-10 years; 14-15 years; 18-24 years; 25-49 years; and 50+ years). The first study deals with the development of tolerance for musical styles across the life span, and the second with the rated eminence of pop music artists. Both of these encompass aspects of stylistic knowledge and preference, and the results are discussed in terms of this cognitive-affective distinction.

Effect of musical experience on tonal language perception

The Journal of the Acoustical Society of America, 2014

Both functional and structural overlap between musical experience and language-learning ability have been discussed and debated in the psychology literature. Of interest in the study presented here is the relationship between tonal structures in music and in language. We tested the relationship between musical training and the ability to identify Mandarin tones in a group of non-Mandarin speakers. The dependent variable was the accuracy of tone identification in the Mandarin phrases. A simple questionnaire was used to measure musical experience. Musical training and identification accuracy were related. Non-significant effects are also discussed.

Age trends in musical preferences in adulthood: 3. Perceived musical attributes as intrinsic determinants of preferences

Increased age has been found to be associated with differences in musical preferences in adulthood. In past research, these differences were mostly attributed to changes in the social context. However, these influences were small, and a large proportion of the variance in age trends in musical preferences remains to be explained. The aim of this article is to investigate the hypothesis that age trends in musical preferences are related to differences in preferences for some intrinsic attributes of the music, in line with the Music Preferences in Adulthood Model (Bonneville-Roussy et al., 2017). Adult participants (N = 481) were asked to rate their preferences for extracts of 51 audio-music recordings (music clips) and for musical attributes related to dynamics, pitch, structure, tempo, and timbre. Audio features of the 51 clips were extracted using Music Information Retrieval (MIR) methods. Using self-report, we found that the musical preferences of adults were linked with distinct likings for musical attributes, with large effects. We also discovered that self-rated attributes associated with dynamics and timbre moderated the links between age and musical preferences. Using the extracted features, we found that musical preferences were linked with distinct patterns of musical features. Finally, we established that the patterns of preferences of emerging, young, and middle-aged adults were increasingly influenced by audio features of timbre, dynamics, and tonal clarity. These findings suggest that age trends in musical preferences could be partially explained by differences in the ways individuals process the intrinsic attributes of the music with age.
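A dynamics attribute like those extracted above is commonly operationalized as the frame-wise root-mean-square (RMS) level of the audio. A minimal stdlib sketch over a synthetic signal (the frame size, signal, and function names are illustrative; real MIR toolkits operate on sampled audio files and extract many more descriptors):

```python
import math

def rms(frame):
    """Root-mean-square level of one frame of samples."""
    return math.sqrt(sum(s * s for s in frame) / len(frame))

def frame_rms(samples, frame_size=1024):
    """Per-frame RMS: a simple dynamics descriptor for a clip."""
    return [rms(samples[i:i + frame_size])
            for i in range(0, len(samples) - frame_size + 1, frame_size)]

# A louder clip yields higher RMS: compare a full-scale and a quiet sine.
loud = [math.sin(2 * math.pi * 440 * n / 44100) for n in range(4096)]
quiet = [0.1 * s for s in loud]
```

Averaging `frame_rms` over a clip gives one number per recording that can then be correlated with listeners' preference ratings, in the spirit of the feature-based analysis described above.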

Lexical tone perception in musicians and non-musicians

2005

It has been suggested that music and speech maintain entirely dissociable mental processing systems. The current study, however, provides evidence that there is an overlap in the processing of certain shared aspects of the two. This study focuses on fundamental frequency (pitch), which is an essential component of melodic units in music and lexical and/or intonational units in speech. We hypothesize that extensive experience with the processing of musical pitch can transfer to a lexical pitch-processing domain. To that end, we asked nine English-speaking musicians and nine English-speaking non-musicians to identify and discriminate the four lexical tones of Mandarin Chinese. The subjects performed significantly differently on both tasks; the musicians identified the tones with 89% accuracy and discriminated them with 87% accuracy, while the non-musicians identified them with only 69% accuracy and discriminated them with 71% accuracy. These results provide counter-evidence to the theory of dissociation between music and speech processing.