An ERP study of motor compatibility effects in action language

How Action Context Modulates the Action-Language Relationship: A Topographic ERP Analysis

Brain Topography, 2019

The aim of the present study was to investigate how the context in which an action is presented modulates the effect of action observation on language processing, an effect classically observed in the literature. To address this question, we recorded both behavioral (reaction times) and electrophysiological (event-related potentials) measures from participants performing a semantic decision task involving a verb describing an action that was congruent or incongruent with the action presented in a previously observed prime picture. The prime picture presented an action performed in a usual or an unusual context. The results revealed different behavioral and topographic response patterns according to the context in which the action was presented. Importantly, only in the usual context did congruency between the prime picture and the verb stimulus facilitate semantic processing, leading to shorter response times in this condition compared to the others. Moreover, the topographic analysis revealed that this facilitation was related to reduced processing times both for semantic access to the verb and for the motor preparation of the answer. Taken together, these findings demonstrate that the context of an action is crucial to the link between action and language.

Action-Sentence Compatibility: The Role of Action Effects and Timing

Frontiers in Psychology, 2013

Research on embodied approaches to language comprehension suggests that we understand linguistic descriptions of actions by mentally simulating these actions. Evidence is provided by the action-sentence compatibility effect (ACE) which shows that sensibility judgments for sentences are faster when the direction of the described action matches the response direction. In two experiments, we investigated whether the ACE relies on actions or on intended action effects. Participants gave sensibility judgments of auditorily presented sentences by producing an action effect on a screen at a location near the body or far from the body. These action effects were achieved by pressing a response button that was located in either the same spatial direction as the action effect, or in the opposite direction. We used a go/no-go task in which the direction of the to-be-produced action effect was either cued at the onset of each sentence (Experiment 1) or at different points in time before and after sentence onset (Experiment 2). Overall, results showed a relationship between the direction of the described action and the direction of the action effect. Furthermore, Experiment 2 indicated that depending on the timing between cue presentation and sentence onset, participants responded either faster when the direction of the described action matched the direction of the action effect (positive ACE), or slower (negative ACE). These results provide evidence that the comprehension of action sentences involves the activation of representations of action effects. Concurrently activated representations in sentence comprehension and action planning can lead to both priming and interference, which is discussed in the context of the theory of event coding.

Spatiotemporal dynamics during processing of abstract and concrete verbs: An ERP study

Neuropsychologia, 2014

Different accounts have been proposed to explain the nature of concept representations. Embodied accounts claim a key involvement of sensory-motor systems during semantic processing, while more traditional accounts posit that concepts are abstract mental entities independent of perceptual and motor brain systems. While the involvement of sensory-motor areas in concrete language processing is supported by a large number of studies, this involvement is far from established for abstract language. The present study addressed abstract and concrete verb processing by investigating the spatiotemporal dynamics of evoked responses by means of high-density EEG while participants performed a semantic decision task. In addition, RTs to the same set of stimuli were collected. In both early and late time intervals, ERP scalp topography differed significantly according to word category. Concrete verbs showed involvement of parieto-frontal networks for action, according to the implied body effector. In contrast, abstract verbs recruited mostly frontal regions outside the motor system, suggesting non-motor semantic processing for this category. In addition, in contrast to what has been reported during action observation, the parietal recruitment related to concrete verb presentation followed the frontal one. The present findings suggest that action word semantics are grounded in sensory-motor systems, provided a bodily effector is specified, while the representation of abstract concepts cannot easily be explained by motor embodiment.

Embodied Cognition: Is Activation of the Motor Cortex Essential for Understanding Action Verbs?

Journal of Experimental Psychology: Learning, Memory, and Cognition

In 8 experiments using language processing tasks ranging from lexical decision to sensibility judgment, participants made hand or foot responses after reading hand- or foot-associated words such as action verbs. In general, response time (RT) tended to be faster when the hand- versus foot-associated word was compatible with the limb required to respond (e.g., a hand response to a hand-associated word) than when it was incompatible (e.g., a foot response to a hand-associated word). To see whether this compatibility effect reflects differential hand- versus foot-specific motor activation produced by the words, as suggested by some embodied theories of language understanding, we monitored 2 event-related potential (ERP) measures previously found to be sensitive to the activation of these limbs. As expected, the ERP results replicated previous findings that the monitored ERPs differ for hand versus foot movements. More importantly, the ERPs provided no evidence of any difference for hand- versus foot-associated words. Thus, the results weaken previous claims that understanding action verbs requires activation of the motor areas used to carry out the named action. Instead, they support claims that language-related compatibility effects on RT may arise prior to motor processes, which implies that such effects are not decisive evidence for embodied language understanding.

When syntax meets action: Brain potential evidence of overlapping between language and motor sequencing

This study aims to extend the embodied cognition approach to syntactic processing. The hypothesis is that the brain resources used to plan and perform motor sequences are also involved in syntactic processing. To test this hypothesis, event-related brain potentials (ERPs) were recorded while participants read sentences with embedded relative clauses and judged their acceptability (half of the sentences contained a subject-verb morphosyntactic disagreement). The sentences, previously divided into three segments, were self-administered segment by segment in two different sequential manners: linear or non-linear. Linear self-administration consisted of successively pressing three buttons with three consecutive fingers of the right hand, while non-linear self-administration substituted the right foot for the finger in the middle position. Our aim was to test whether syntactic processing could be affected by the manner in which the sentences were self-administered. The main results revealed that the LAN component of the ERPs vanished whereas the P600 component increased in response to incorrect verbs for non-linear relative to linear self-administration. The LAN and P600 components reflect early and late syntactic processing, respectively. Our results provide evidence that syntactic language processing and the performance of non-linguistic motor sequences may share resources in the human brain.

Action word understanding and overt motor behavior

Experimental Brain Research, 2009

Is the motor system involved in language processing? In order to clarify this issue, we carried out three behavioral experiments using go/no-go and choice paradigms. In all the experiments, we used a semantic decision task with an early delivery of the go signal (during processing of the language material). Italian verbs expressing hand actions, foot actions or an abstract content served as stimuli. Participants executed intransitive (Experiment 1) or transitive (Experiment 2) actions with their right hand in response to the acoustic presentation of action-related verbs and refrained from responding to abstract verbs. The kinematics of the actions was slowed down by hand action-related verbs when compared with foot action-related verbs. In Experiment 3, hand-related and foot-related verbs were presented. Participants responded to hand-related and foot-related verbs with their hand and their foot (compatible condition), and in another block of trials they responded to hand-related and foot-related verbs with their foot and their hand (incompatible condition), respectively. In the compatible condition, the beginning of the action was faster, whereas the kinematics of the action was slower. The present findings suggest complete activation of verb-related motor programs during language processing. The data are discussed in support of the hypothesis that this complete activation is a necessary requisite for understanding the exact meaning of action words, because the goal and consequences of the actions are represented.

The comprehension of action-related sentences may cause interference rather than facilitation on matching actions

This study used a dual-task paradigm to analyze the time course of motor resonance during the comprehension of action language. In the study, participants read sentences describing a transfer either away from themselves ("I threw the tennis ball to my rival") or toward themselves ("My rival threw me the tennis ball"). When the transfer verb appeared on the screen, and after a variable stimulus onset asynchrony (SOA), a visual motion cue (Experiment 1) or a static cue (Experiment 2) prompted participants to move their hand either away from or toward themselves to press a button. The results showed meaning-action interference at short SOAs and facilitation at the longest SOA for the matching conditions. These results support the hypothesis that motor processes associated with the comprehension of action-related language interfere with an overlapping motor task, whereas they facilitate a delayed motor task. These effects are discussed in terms of resonance processes in the motor cortex.

Motor activation in literal and non-literal sentences: does time matter?

Despite the impressive amount of evidence showing involvement of the sensorimotor systems in language processing, important questions remain unresolved, among them the relationship between non-literal uses of language and sensorimotor activation. The literature has not yet provided a univocal answer on whether the comprehension of non-literal, abstract motion sentences engages the same neural networks recruited for literal sentences. A previous TMS study using the same experimental materials as the present study showed activation for literal, fictive and metaphoric motion sentences but not for idiomatic ones. To evaluate whether this may depend on insufficient time for elaborating the idiomatic meaning, we conducted a behavioral experiment that used a sensibility judgment task performed by pressing a button either with a hand finger or with a foot. Motor activation is known to be sensitive to the action-congruency of the effector used for responding. Therefore, all other things being equal, significant differences between responses emitted with an action-congruent or action-incongruent effector (foot vs. hand) may be attributed to motor activation. Foot-related action verbs were embedded in sentences conveying literal motion, fictive motion, metaphoric motion or idiomatic motion. Mental sentences were employed as a control condition. Foot responses were significantly faster than finger responses, but only in literal motion sentences. We hypothesize that motor activation may arise in early phases of the comprehension process (i.e., upon reading the verb) and then decay as a function of the strength of the semantic motion component of the verb.

Switching modalities in a sentence verification task: ERP evidence for embodied language processing

Frontiers in …, 2011

In an event-related potential (ERP) experiment using written language materials only, we investigated a potential modulation of the N400 by the modality switch effect. The modality switch effect occurs when a first sentence, describing a fact grounded in one modality, is followed by a second sentence describing a fact grounded in a different modality. For example, “A cellar is dark” (visual) was preceded either by another visual property, “Ham is pink,” or by a tactile property, “A mitten is soft.” We also investigated whether the modality switch effect occurs for false sentences (“A cellar is light”). We found that, for true sentences, the ERP at the critical word “dark” elicited a significantly greater frontal, early N400-like effect (270–370 ms) when there was a modality mismatch than when there was a modality match. This pattern was not found for the critical word “light” in false sentences. Results similar to the frontal negativity were obtained in a late time window (500–700 ms). The obtained ERP effect is similar to one previously obtained for pictures. We conclude that in this paradigm we obtained fast access to conceptual properties for modality-matched pairs, which leads to embodiment effects similar to those previously obtained with pictorial stimuli.