Infant speech perception bootstraps word learning

How Infant Speech Perception Contributes to Language Acquisition

Language and Linguistics Compass, 2008

Perceiving the acoustic signal as a sequence of meaningful linguistic representations is a challenging task, which infants seem to accomplish effortlessly even though they do not yet have fully developed knowledge of language. The present article takes an integrative approach to infant speech perception, emphasizing how young learners' perception of speech helps them acquire abstract structural properties of language. We first introduce what is known about infants' perception of language at birth. We then discuss how perception develops during the first 2 years of life and describe some general perceptual mechanisms whose importance for speech perception and language acquisition has recently been established. To conclude, we discuss the implications of these empirical findings for language acquisition.

Learning words’ sounds before learning how words sound: 9-Month-olds use distinct objects as cues to categorize speech information

Cognition, 2009

One of the central themes in the study of language acquisition is the gap between the linguistic knowledge that learners demonstrate and the apparent inadequacy of linguistic input to support induction of this knowledge. One of the first linguistic abilities to exemplify this problem in the course of development is speech perception: specifically, learning the sound system of one's native language. Native-language sound systems are defined by meaningful contrasts among words in a language, yet infants learn these sound patterns before any significant number of words is acquired. Previous approaches to this learning problem have suggested that infants can learn phonetic categories from statistical analysis of auditory input, without regard to word referents. Experimental evidence presented here suggests instead that young infants can use visual cues present in word-labeling situations to categorize phonetic information. In Experiment 1, 9-month-old English-learning infants failed to discriminate two non-native phonetic categories, establishing baseline performance in a perceptual discrimination task. In Experiment 2, these infants succeeded at discrimination after watching contrasting visual cues (i.e., videos of two novel objects) paired consistently with the two non-native phonetic categories. In Experiment 3, these infants failed at discrimination after watching the same visual cues, but paired inconsistently with the two phonetic categories. At an age before which memory of word labels is demonstrated in the laboratory, 9-month-old infants use contrastive pairings between objects and sounds to influence their phonetic sensitivity. Phonetic learning may have a more functional basis than previous statistical learning mechanisms assume: infants may use cross-modal associations inherent in social contexts to learn native-language phonetic categories.
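
The distributional-learning account that this paper argues against can be made concrete with a small simulation. The sketch below is not from the paper; it assumes simulated voice-onset-time (VOT) values and uses scikit-learn's GaussianMixture to show how category structure could, in principle, be recovered from the sound distribution alone, with the number of categories chosen by BIC.

```python
# Illustrative sketch (not from the paper): the distributional-learning account,
# modeled as a Gaussian mixture fit to simulated voice-onset-time (VOT) values.
# All VOT numbers are hypothetical.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Bimodal exposure: tokens drawn from two VOT distributions (/d/-like vs. /t/-like).
bimodal = np.concatenate([rng.normal(15, 5, 200), rng.normal(70, 10, 200)]).reshape(-1, 1)
# Unimodal exposure: tokens drawn from a single intermediate distribution.
unimodal = rng.normal(40, 15, 400).reshape(-1, 1)

def inferred_categories(vot_tokens, max_k=3):
    """Choose the number of phonetic categories by BIC over 1..max_k mixture models."""
    fits = [GaussianMixture(n_components=k, random_state=0).fit(vot_tokens)
            for k in range(1, max_k + 1)]
    best = min(fits, key=lambda m: m.bic(vot_tokens))
    return best.n_components, np.sort(best.means_.ravel())

print(inferred_categories(bimodal))   # typically 2 categories, means near 15 and 70 ms
print(inferred_categories(unimodal))  # typically 1 category, mean near 40 ms
```

In this framing, the paper's contribution is that 9-month-olds may not rely solely on such unsupervised statistics: consistent pairings with distinct objects act as a cross-modal cue, roughly analogous to attaching a visual label to each token.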

Influences on Infant Speech Processing: Toward a New Synthesis

Annual Review of Psychology, 1999

To comprehend and produce language, we must be able to recognize the sound patterns of our language and the rules for how these sounds "map on" to meaning. Human infants are born with a remarkable array of perceptual sensitivities that allow them to detect the basic properties that are common to the world's languages. During the first year of life, these sensitivities undergo modification reflecting an exquisite tuning to just that phonological information that is needed to map sound to meaning in the native language. We review this transition from language-general to language-specific perceptual sensitivity that occurs during the first year of life and consider whether the changes propel the child into word learning. To account for the broad-based initial sensitivities and subsequent reorganizations, we offer an integrated transactional framework based on the notion of a specialized perceptual-motor system that has evolved to serve human speech, but which functions in concert with other developing abilities. In so doing, we highlight the links between infant speech perception, babbling, and word learning.

Infants’ advances in speech perception shape their earliest links between language and cognition

Scientific Reports

The power of human language derives not only from the precision of its signal or the complexity of its grammar, but also from its links to cognition. Infants as young as 3 months have begun to link language and core cognitive capacities. At 3 and 4 months, this link is not exclusive to human language: listening to vocalizations of nonhuman primates also supports infant cognition. By 6 months, infants have tuned this link to human speech alone. Here we provide evidence that infants' increasing precision in speech perception shapes which signals they will link to cognition. Infants listening to German, a nonnative language that shares key rhythmic and prosodic properties with their own native language (English), successfully formed object categories. In contrast, those listening to Cantonese, a language that differs considerably in these suprasegmental properties, failed. This provides the first evidence that infants' increasingly precise perceptual tuning to the sounds of their native language sets constraints on the range of human languages they will link to cognition: infants begin to specify which human languages they will link to core cognitive capacities even before they sever the link between nonhuman primate vocalizations and cognition.
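
The "rhythmic and prosodic properties" at issue are often quantified with interval-based rhythm metrics. The sketch below is illustrative only: it computes two standard metrics (%V and ΔC, in the spirit of Ramus, Nespor & Mehler, 1999) from hypothetical vocalic/consonantal interval annotations; the durations are invented and do not describe English, German, or Cantonese.

```python
# A minimal sketch of two interval-based rhythm metrics, %V and Delta-C (in the
# spirit of Ramus, Nespor & Mehler, 1999). The interval durations below are
# hypothetical placeholders, not measurements from any real language sample.
from statistics import pstdev

def rhythm_metrics(intervals):
    """intervals: list of (kind, duration_ms), kind 'V' (vocalic) or 'C' (consonantal)."""
    vocalic = [d for kind, d in intervals if kind == "V"]
    consonantal = [d for kind, d in intervals if kind == "C"]
    percent_v = 100.0 * sum(vocalic) / (sum(vocalic) + sum(consonantal))
    delta_c = pstdev(consonantal)  # variability of consonantal interval durations
    return round(percent_v, 1), round(delta_c, 1)

# Hypothetical utterance segmented into consonantal/vocalic intervals (ms).
utterance = [("C", 90), ("V", 120), ("C", 140), ("V", 80), ("C", 60), ("V", 150)]
print(rhythm_metrics(utterance))  # (%V, Delta-C) for this invented utterance
```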

When Half a Word Is Enough: Infants Can Recognize Spoken Words Using Partial Phonetic Information

Child Development, 2001

Adults process speech incrementally, rapidly identifying spoken words on the basis of initial phonetic information sufficient to distinguish them from alternatives. In this study, infants in the second year also made use of word-initial information to understand fluent speech. The time course of comprehension was examined by tracking infants' eye movements as they looked at pictures in response to familiar spoken words, presented both as whole words in intact form and as partial words in which only the first 300 ms of the word was heard. In Experiment 1, 21-month-old infants (N = 32) recognized partial words as quickly and reliably as they recognized whole words; in Experiment 2, these findings were replicated with 18-month-old infants (N = 32). Combining the data from both experiments, efficiency in spoken word recognition was examined in relation to level of lexical development. Infants with more than 100 words in their productive vocabulary were more accurate in identifying familiar words than were infants with fewer than 60 words. Grouped by response speed, infants with faster mean reaction times were more accurate in word recognition and also had larger productive vocabularies than infants with slower response latencies. These results show that infants in the second year are capable of incremental speech processing even before entering the vocabulary spurt, and that lexical growth is associated with increased speed and efficiency in understanding spoken language.
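
The eye-movement measures described here can be thought of, at a high level, as accuracy and latency scores computed from frame-by-frame gaze codes. The sketch below is not the authors' analysis code; the frame rate, analysis window, and trial data are hypothetical and serve only to illustrate how a reaction time might be derived from a shift away from the distractor picture after word onset.

```python
# Illustrative sketch (not the authors' analysis code): deriving accuracy and a
# reaction time from frame-by-frame gaze codes in a looking-while-listening trial.
# The frame rate, analysis window, and sample trial are hypothetical.
FRAME_MS = 33  # ~30 Hz video coding (assumed)

def trial_outcome(frames, onset_frame, window_ms=1800):
    """frames: gaze codes per video frame ('T' target, 'D' distractor, '.' away).
    Returns (correct, rt_ms): correct if the infant fixates the target within the
    window after word onset; rt_ms is the shift latency on distractor-initial trials."""
    window = frames[onset_frame:onset_frame + window_ms // FRAME_MS]
    correct = "T" in window
    rt_ms = None
    if frames[onset_frame] == "D" and correct:   # distractor-initial trial:
        rt_ms = window.index("T") * FRAME_MS     # latency of the shift to the target
    return correct, rt_ms

# Hypothetical trial: the infant is on the distractor at word onset (frame 20)
# and shifts to the target picture about 400 ms later.
gaze = ["D"] * 32 + ["T"] * 30
print(trial_outcome(gaze, onset_frame=20))  # -> (True, 396)
```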

Learning words and learning sounds: Advances in language development

British Journal of Psychology, 2016

Phonological development is sometimes seen as a process of learning sounds, or forming phonological categories, and then combining sounds to build words, with the evidence taken largely from studies demonstrating 'perceptual narrowing' in infant speech perception over the first year of life. In contrast, studies of early word production have long provided evidence that holistic word learning may precede the formation of phonological categories. In that account, children begin by matching their existing vocal patterns to adult words, with knowledge of the phonological system emerging from the network of related word forms. Here I review evidence from production and then consider how the implicit and explicit learning mechanisms assumed by the complementary memory systems model might be understood as reconciling the two approaches.

The Roots of the Early Vocabulary in Infants' Learning From Speech

Current Directions in Psychological Science, 2008

Psychologists have known for over 20 years that infants begin learning the speech-sound categories of their language during the first 12 months of life. This fact has dominated researchers' thinking about how language acquisition begins, although the relevance of this learning to the child's progress in language acquisition has never been clear. Recently, views of the role of infancy in language acquisition have begun to change, with a new focus on the development of the vocabulary. Infants' learning of speech-sound categories and infants' abilities to extract regularities in the speech stream allow learning of the auditory forms of many words. These word forms then become the foundation of the early vocabulary, support children's learning of the language's phonological system, and contribute to the discovery of grammar.
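
One concrete proposal for how infants "extract regularities in the speech stream" is segmentation at dips in syllable-to-syllable transitional probability (in the spirit of Saffran, Aslin & Newport, 1996). The sketch below is a toy illustration with an invented syllable stream, not a model taken from the article.

```python
# A toy sketch of segmentation by transitional probability (TP), in the spirit of
# Saffran, Aslin & Newport (1996). The three "words" and the stream are invented.
from collections import Counter

def transitional_probs(syllables):
    """TP(x -> y) = count(x followed by y) / count(x), over adjacent syllables."""
    pair_counts = Counter(zip(syllables, syllables[1:]))
    first_counts = Counter(syllables[:-1])
    return {pair: n / first_counts[pair[0]] for pair, n in pair_counts.items()}

def segment(syllables, tps, threshold=0.75):
    """Posit a word boundary wherever the TP between adjacent syllables dips."""
    words, current = [], [syllables[0]]
    for prev, nxt in zip(syllables, syllables[1:]):
        if tps[(prev, nxt)] < threshold:
            words.append("".join(current))
            current = []
        current.append(nxt)
    words.append("".join(current))
    return words

# Invented trisyllabic words concatenated into a continuous syllable stream.
lexicon = [["go", "la", "bu"], ["ti", "da", "ro"], ["pa", "bi", "ku"]]
stream = [syl for i in [0, 1, 2, 0, 2, 1, 0, 1, 2] for syl in lexicon[i]]
print(segment(stream, transitional_probs(stream)))
# Within-word TPs are 1.0 and between-word TPs are lower, so the three word
# forms are recovered, e.g. ['golabu', 'tidaro', 'pabiku', 'golabu', ...]
```

On this view, the recovered word forms are exactly the "auditory forms of many words" that the article describes as the foundation of the early vocabulary.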