Elena Aggius-Vella - Academia.edu
Papers by Elena Aggius-Vella
Children
Spatial representation is a crucial skill for everyday interaction with the environment. Different factors seem to influence spatial perception, such as body movements and vision. However, it is still unknown if motor impairment affects the building of simple spatial perception. To investigate this point, we tested hemiplegic children with (HV) and without (H) visual field disorders in an auditory and visual-spatial localization and pitch discrimination task. Fifteen hemiplegic children (nine H and six HV) and twenty with typical development took part in the experiment. The tasks consisted of listening to a sound coming from a series of speakers positioned at the front or back of the subject. In one condition, subjects were asked to discriminate the pitch, while in the other, subjects had to localize the position of the sound. We also replicated the spatial task in a visual modality. Both groups of hemiplegic children performed worse in the auditory spatial localization task compare...
Perception, 2021
When vision is unavailable, auditory level and reverberation cues provide important spatial information regarding the environment, such as the size of a room. We investigated how room-size estimates were affected by stimulus type, level, and reverberation. In Experiment 1, 15 blindfolded participants estimated room size after performing a distance bisection task in virtual rooms that were either anechoic (with level cues only) or reverberant (with level and reverberation cues) with a relatively short reverberation time of T60 = 400 ms. Speech, noise, or clicks were presented at distances between 1.9 and 7.1 m. The reverberant room was judged to be significantly larger than the anechoic room (p < .05) for all stimuli. In Experiment 2, only the reverberant room was used and the overall level of all sounds was equalized, so only reverberation cues were available. Ten blindfolded participants took part. Room-size estimates were significantly larger for speech than for cli...
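As a purely illustrative aside (none of this code is from the paper), the two acoustic cues named in the abstract behave differently with source distance: under a free-field assumption the direct-sound level falls by about 6 dB per doubling of distance, while the diffuse reverberant energy in a room stays roughly constant, so the direct-to-reverberant (D/R) ratio also decreases with distance. The reference levels and distances in the sketch below are assumptions chosen only to make the example concrete.

```python
# Illustrative sketch (not from the paper): how level and direct-to-reverberant
# ratio cues behave as a sound source moves away from the listener.
import numpy as np

def level_db(distance_m, ref_level_db=70.0, ref_distance_m=1.0):
    """Direct-sound level under the inverse-square law (-6 dB per doubling)."""
    return ref_level_db - 20.0 * np.log10(distance_m / ref_distance_m)

def direct_to_reverberant_db(distance_m, reverb_level_db=55.0):
    """D/R ratio, assuming roughly constant diffuse reverberant energy."""
    return level_db(distance_m) - reverb_level_db

for d in (1.9, 3.5, 5.3, 7.1):   # distances comparable to those in the experiment
    print(f"{d:4.1f} m  level {level_db(d):5.1f} dB  D/R {direct_to_reverberant_db(d):5.1f} dB")
```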
Neuroscience & Biobehavioral Reviews, 2016
The last quarter of a century has seen a dramatic rise of interest in the development of technological solutions for visually impaired people. However, despite the presence of many devices, user acceptance is low. Not only are visually impaired adults not using these devices, but the devices are also too complex for children. The majority of these devices have been developed without considering either the brain mechanisms underlying the deficit or the natural ability of the brain to process information. Most of them use complex feedback systems and overwhelm sensory, attentional and memory capacities. Here we review the neuroscientific studies on orientation and mobility in visually impaired adults and children and present the technological devices developed so far to improve locomotion skills. We also discuss how we think these solutions could be improved. We hope that this paper may be of interest to neuroscientists and technologists and that it will provide a common background to develop new science-driven technology, more accepted by visually impaired adults and suitable for children with visual disabilities.
Hearing Research, 2022
The distance of sound sources relative to the body can be estimated using acoustic level and direct-to-reverberant ratio cues. However, the ability to do this may differ for sounds that are in front compared to behind the listener. One reason for this is that vision, which plays an important role in calibrating auditory distance cues early in life, is unavailable for rear space. Furthermore, the filtering of sounds by the pinnae differs if they originate from the front compared to the back. We investigated auditory distance discrimination in front and rear space by comparing performance for auditory spatial bisection of distance and minimum audible distance discrimination (MADD) tasks. In the bisection task, participants heard three successive bursts of noise at three different distances and indicated whether the second sound (probe) was closer in space to the first or third sound (references). In the MADD task, participants reported which of two successive sounds was closer. An analysis of variance with factors task and region of space showed worse performance for rear than for front space, but no significant interaction between task and region of space. For the bisection task, the point of subjective equality (PSE) was slightly biased towards the body, but the absolute magnitude of the PSE did not differ between front and rear space. These results are consistent with the hypothesis that visual information is important in calibrating the auditory representation of front space in distance early in life. © 2022 The Authors. Published by Elsevier B.V. This is an open access article under the CC BY license (http://creativecommons.org/licenses/by/4.0/).
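For readers unfamiliar with the bisection measure, the point of subjective equality (PSE) mentioned above is typically obtained by fitting a psychometric function to the probe-position responses. The sketch below is a minimal, hypothetical illustration of that step; the probe distances and response proportions are invented, and this is not the authors' analysis code.

```python
# Minimal sketch: estimate the PSE of a distance-bisection task by fitting a
# cumulative-Gaussian psychometric function to the proportion of
# "probe closer to the far reference" responses at each probe distance.
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

probe_m = np.array([2.5, 3.2, 3.9, 4.6, 5.3, 6.0])        # probe distances (m), made up
p_far   = np.array([0.05, 0.15, 0.40, 0.70, 0.90, 0.97])  # prop. "closer to far", made up

def psychometric(x, pse, sigma):
    return norm.cdf(x, loc=pse, scale=sigma)

(pse, sigma), _ = curve_fit(psychometric, probe_m, p_far, p0=[4.0, 1.0])
physical_midpoint = (probe_m[0] + probe_m[-1]) / 2
print(f"PSE = {pse:.2f} m, precision (sigma) = {sigma:.2f} m")
print(f"Shift of PSE from the physical midpoint: {pse - physical_midpoint:+.2f} m")
```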
I hereby declare that except where specific reference is made to the work of others, the contents... more I hereby declare that except where specific reference is made to the work of others, the contents of this dissertation are original and have not been submitted in whole or in part for consideration for any other degree or qualification in this, or any other university. This dissertation is my own work and contains nothing which is the outcome of work done in collaboration with others, except as specified in the text and Acknowledgements. This dissertation contains fewer than 65,000 words including appendices, bibliography, footnotes, tables and equations and has fewer than 150 figures.
Neuropsychologia, 2020
In measuring behavioural and pupillary responses to auditory oddball stimuli delivered in the front and rear peri-personal space, we find that pupils dilate in response to rare stimuli, both targets and distracters. Dilation in response to targets is stronger than the response to distracters, implying a task relevance effect on pupil responses. Crucially, pupil dilation in response to targets is also selectively modulated by the location of sound sources: stronger in the front than in the rear peri-personal space, in spite of matching behavioural performance. This supports the concept that even non-spatial skills, such as the ability to alert in response to behaviourally relevant events, are differentially engaged across subregions of the peri-personal space.
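A minimal sketch of the kind of pupillometry preprocessing such a study implies: epoching the pupil trace around stimulus onsets, subtracting a pre-stimulus baseline, and comparing mean dilation for targets versus distracters. The sampling rate, window lengths, onset times and simulated trace are all assumptions for illustration; this is not the study's pipeline.

```python
# Illustrative pupillometry sketch: baseline-corrected epochs around onsets.
import numpy as np

fs = 60                                      # assumed pupil samples per second
pre, post = int(0.5 * fs), int(2.0 * fs)     # 0.5 s baseline, 2 s response window

def epoch(trace, onsets):
    """Cut baseline-corrected epochs around each stimulus onset (in samples)."""
    out = []
    for t in onsets:
        seg = trace[t - pre : t + post].astype(float)
        out.append(seg - seg[:pre].mean())   # subtract pre-stimulus baseline
    return np.array(out)

rng = np.random.default_rng(0)
pupil = rng.normal(3.0, 0.05, 60 * fs)       # fake 60 s pupil-diameter trace (mm)
target_onsets = [400, 1200, 2100]            # hypothetical onsets (samples)
distracter_onsets = [700, 1600, 2600]

targets = epoch(pupil, target_onsets)
distracters = epoch(pupil, distracter_onsets)
print("mean dilation, targets:     %.3f mm" % targets[:, pre:].mean())
print("mean dilation, distracters: %.3f mm" % distracters[:, pre:].mean())
```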
Scientific Reports
Although vision is important for calibrating auditory spatial perception, it only provides information about frontal sound sources. Previous studies of blind and sighted people support the idea that azimuthal spatial bisection in frontal space requires visual calibration, while detection of a change in azimuth (minimum audible angle, MAA) does not. The influence of vision on the ability to map frontal, lateral and back space has not been investigated. Performance in spatial bisection and MAA tasks was assessed for normally sighted blindfolded subjects using bursts of white noise presented frontally, laterally, or from the back relative to the subjects. Thresholds for both tasks were similar in frontal space, lower for the MAA task than for the bisection task in back space, and higher for the MAA task in lateral space. Two interpretations of the results are discussed, one in terms of visual calibration and the use of internal representations of source location and the other based on ...
Scientific Reports, Jun 20, 2018
Vision seems to have a pivotal role in developing spatial cognition. A recent approach, based on sensory calibration, has highlighted the role of vision in calibrating hearing in spatial tasks. It was shown that blind individuals have specific impairments during audio spatial bisection tasks. Vision is available only in the frontal space, leading to a "natural" blindness in the back. If vision is important for audio space calibration, then the auditory frontal space should be better represented than the back auditory space. In this study, we investigated this point by comparing frontal and back audio spatial metric representations. We measured precision in the spatial bisection task, for which vision seems to be fundamental to calibrate audition, in twenty-three sighted subjects. Two control tasks, a minimum audible angle and a temporal bisection, were employed in order to evaluate auditory precision in the different regions considered. While no differences were observed between frontal and back space in the minimum audible angle (MAA) and temporal bisection tasks, a significant difference was found in the spatial bisection task, where subjects performed better in the frontal space. Our results are in agreement with the idea that vision is important in developing auditory spatial metric representation in sighted individuals. To date, most of the experiments testing auditory spatial representation have focused on the frontal space, mostly at ear level. However, several pieces of research have shown that the space around the body is not coded as a unitary dimension, but is split into multiple regions in which stimuli are differently perceived. The major divisions are between far and near space1-4, space around different body regions5-7, and left and right space, as shown in neglect patients8. A less investigated space is the back zone. This zone is particularly interesting, as vision and action are not easily available there. The back space might be processed differently from the frontal space, as suggested by studies on neglect9-11 and on multisensory integration12,13, where a facilitatory effect of audio-tactile interactions was found in the rear space12. An interesting experiment on temporal order judgment14 showed that the spatiotemporal representation of non-visual stimuli in front and rear space is different. In this experiment, stimuli were delivered on crossed and uncrossed hands in two different spaces: frontal and rear space. Results showed significantly better performance in the crossed posture when the hands were placed in the back space than in the frontal space. This finding is particularly interesting, as previous experiments on blind and sighted people showed that congenitally blind individuals do not demonstrate any impairment in tactile temporal order judgment tasks (TOJs) as a result of crossing their hands15,16. It is possible to speculate, therefore, that underlying mechanisms similar to those adopted by blind people might also be adopted by sighted people in space where vision is not available14. Interestingly, another study17 showed that the impairment during the crossed posture is still present around the legs (ankles). The study revealed that audio and visual information, and their integration, differently affect the localization of tactile stimuli.
This suggests that mechanisms controlling the alignment between somatotopic and external reference frames include spatial features conveyed by the auditory and visual modalities. To date, it is not clear how vision differently influences the perception of stimuli in the spaces around us. If vision is important in spatial representation, then we might expect visual and non-visual spaces to be differently represented. On one hand, the sensory compensation hypothesis18,19 states that the lack of a sensory ability (e.g. vision) leads to an improved ability of the non-visual senses. Studies on blind subjects support the idea that this enhanced auditory ability is due to cross-modal plasticity20,21. The visual cortex is highly plastic; this is more evident in young animals, but it is still present in adulthood22. This plasticity allows the visual cortex in congenitally blind people to become colonized by other sensory systems (i.e. auditory and somatosensory)23,24. A few days of binocular deprivation are sufficient for the primary visual cortex to be colonized by touch25. There is also psychophysical evidence that the congenitally blind have enhanced tactile discrimination26, auditory pitch discrimination27, and sound localization28,29, and are able to properly form spatial topographical maps30,31. Spatial hearing tasks have been shown to activate
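The design described above, with the same listeners measured in each task and in both front and back space, maps onto a repeated-measures comparison of precision across task and region. The sketch below only illustrates that structure with randomly generated placeholder thresholds for the two spatial tasks; it is a hedged illustration, not the paper's analysis.

```python
# Hypothetical task x region repeated-measures ANOVA on placeholder thresholds.
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(1)
rows = []
for subj in range(1, 24):                       # 23 subjects, as in the study
    for task in ("spatial bisection", "MAA"):
        for region in ("front", "back"):
            rows.append({"subject": subj, "task": task, "region": region,
                         "threshold_deg": rng.normal(10, 2)})  # simulated values
df = pd.DataFrame(rows)

res = AnovaRM(df, depvar="threshold_deg", subject="subject",
              within=["task", "region"]).fit()
print(res.anova_table)
```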
Brain and Cognition, 2017
Material related to the self, as well as to significant others, often displays mnemonic superiority through its associations with highly organised and elaborate representations. Neuroimaging studies suggest this effect is related to activation in regions of medial prefrontal cortex (mPFC). Incidental memory scores for trait adjectives, processed in relation to the self, a good friend and David Cameron, were collected. Scores for each referent were used as regressors in seed-based analyses of resting state fMRI data performed in ventral, middle and dorsal mPFC seeds, as well as the hippocampal formation. Stronger memory for self-processed items was predicted by functional connectivity between ventral mPFC, angular gyrus and middle temporal gyri. These regions are within the default mode network, linked to relatively automatic aspects of memory retrieval. In contrast, memory for items processed in relation to best friends was better in individuals whose ventral mPFC showed relatively wea...
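To make the seed-based approach concrete, the following sketch (simulated data, not the study's pipeline) computes a Fisher-z connectivity map between a seed time series and every other voxel for each subject, then regresses the voxelwise connectivity values on the subjects' memory scores across the group, which is roughly what "scores used as regressors in seed-based analyses" describes. All shapes, values and the seed definition are placeholders.

```python
# Hypothetical seed-based connectivity + group regression on memory scores.
import numpy as np

rng = np.random.default_rng(2)
n_subjects, n_timepoints, n_voxels = 20, 200, 500

connectivity = np.empty((n_subjects, n_voxels))
memory_score = rng.normal(0.6, 0.1, n_subjects)      # e.g. self-referent recall rate

for s in range(n_subjects):
    voxels = rng.normal(size=(n_timepoints, n_voxels))   # simulated BOLD time series
    seed = voxels[:, :10].mean(axis=1)                   # placeholder "seed" region
    z = (voxels - voxels.mean(0)) / voxels.std(0)
    zs = (seed - seed.mean()) / seed.std()
    connectivity[s] = np.arctanh(zs @ z / n_timepoints)  # Fisher-z correlation map

# Group level: does seed connectivity predict memory score at each voxel?
X = np.column_stack([np.ones(n_subjects), memory_score])
betas, *_ = np.linalg.lstsq(X, connectivity, rcond=None)
print("voxelwise slope of connectivity on memory score (first 5):", betas[1, :5])
```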
Frontiers in Integrative Neuroscience
Spatial representation is developed thanks to the integration of visual signals with the other senses. It has been shown that the lack of vision compromises the development of some spatial representations. In this study we tested the effect of a new rehabilitation device called ABBI (Audio Bracelet for Blind Interaction) on improving space representation. ABBI produces audio feedback linked to body movement. Previous studies from our group showed that this device improves spatial representation in early blind adults around the upper part of the body. Here we evaluate whether the audio-motor feedback produced by ABBI can also improve audio spatial representation in sighted individuals in the space around the legs. Forty-five blindfolded sighted subjects participated in the study, subdivided into three experimental groups. An audio space localization (front-back discrimination) task was performed twice by all groups of subjects, before and after different kinds of training conditions. One group (experimental) performed audio-motor training with the ABBI device placed on their foot. Another group (control) performed a free motor activity without audio feedback associated with body movement. The third group (control) passively listened to the ABBI sound moved at foot level by the experimenter, without producing any body movement. Results showed that only the experimental group, which performed the training with audio-motor feedback, showed an improvement in accuracy for sound discrimination. No improvement was observed for the two control groups. These findings suggest that audio-motor training with ABBI improves audio space perception in sighted individuals also in the space around the legs. This result provides important input for the rehabilitation of spatial representation in the lower part of the body.
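A hypothetical sketch of the pre/post comparison such a three-group training design implies: front-back discrimination accuracy measured before and after training and tested within each group with a paired t-test. All accuracy values and effect sizes below are simulated placeholders, not the study's data or code.

```python
# Illustrative pre/post training comparison for three groups of 15 subjects.
import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(3)
n_per_group = 15                                 # 45 subjects split into 3 groups

groups = {
    "audio-motor (ABBI)": 0.08,                  # assumed mean pre-to-post gain
    "motor only":         0.00,
    "passive listening":  0.00,
}
for name, true_gain in groups.items():
    pre = rng.normal(0.70, 0.05, n_per_group)    # simulated pre-training accuracy
    post = pre + rng.normal(true_gain, 0.03, n_per_group)
    t, p = ttest_rel(post, pre)
    print(f"{name:20s} mean gain {np.mean(post - pre):+.3f}  t={t:5.2f}  p={p:.3f}")
```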
Frontiers in Psychology, 2017
Studies have found that portions of space around our body are coded differently by our brain. Numerous works have investigated visual and auditory spatial representation, focusing mostly on the spatial representation of stimuli presented at head level, especially in the frontal space. Only a few studies have investigated spatial representation around the entire body and its relationship with motor activity. Moreover, it is still not clear whether the space surrounding us is represented as a unitary dimension or whether it is split up into different portions, differently shaped by our senses and motor activity. To clarify these points, we investigated audio localization of dynamic and static sounds at different body levels. In order to understand the role of a motor action in auditory space representation, we asked subjects to localize sounds by pointing with the hand or the foot, or by giving a verbal answer. We found that the audio sound localization was different depending on the bo...