Integration of vibrotactile frequency information beyond the mechanoreceptor channel and somatotopy

A wide variety of tactile sensations arise from the activation of several types of mechanoreceptor-afferent channels scattered all over the body, and their projections create a somatotopic map in the somatosensory cortex. Recent findings challenge the traditional view that tactile signals from different mechanoreceptor channels and locations are processed independently in the brain, though the contribution of signal integration to perception remains obscure. Here we show that vibrotactile frequency perception is functionally enriched by signal integration across different mechanoreceptor channels and separate skin locations. When participants touched two sinusoidal vibrations of widely different frequencies, which dominantly activated separate channels, with neighboring fingers or with different hands, and judged the frequency of one vibration, the perceived frequency shifted toward that of the other (assimilation effect). Furthermore, when the participants judged the frequency of the pair as a whole, they consistently reported an intensity-based interpolation of the two vibrations (averaging effect). Both effects were similar in magnitude between the same-hand and different-hand conditions and were significantly diminished by asynchronous presentation of the vibration pair. These findings indicate that human tactile processing is global and flexible in that it can estimate the ensemble properties of a large-scale tactile event sensed by various receptors distributed over the body.

Touching an object produces a unique skin deformation reflecting mechanical interactions between the object and the skin. The spatiotemporal features of skin deformation are neurally encoded in the activation patterns of multiple mechanoreceptor-afferent channels with distinct spatiotemporal tuning characteristics 1-5, forming the basis of the content information ('what') of touch.
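The averaging effect described above — an intensity-based interpolation of the two component frequencies — admits a simple illustrative formalization. The following sketch is ours, not the authors': it assumes the perceived ensemble frequency is an intensity-weighted mean, with the symbols f_1, f_2 (component frequencies) and I_1, I_2 (perceived intensities) introduced here for illustration only.

```latex
% Hypothetical sketch of the averaging effect (our notation, not the authors'):
% the perceived ensemble frequency as an intensity-weighted mean of the
% two component frequencies f_1 and f_2 with perceived intensities I_1, I_2.
\hat{f}_{\mathrm{ensemble}} \approx \frac{I_1 f_1 + I_2 f_2}{I_1 + I_2}
```

Under this reading, increasing the intensity of one vibration should pull the ensemble judgment toward that vibration's frequency, which is what "intensity-based interpolation" implies.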
The channels' responses at each skin location are sent to the central nervous system (the contralateral side of the cortex) with the somatotopic organization preserved 6, 7. The somatotopy forms the basis of the location information ('where') of touch. Given this structure, a fundamental question in tactile processing is how the brain processes these multi-channel inputs from multiple skin locations to compute the final touch percept. Specifically, we are interested in whether inputs from different channels, and those from different skin locations, contribute to the final percept independently or in an integrated manner.

With regard to inputs from different mechanoreceptor channels, the conventional view favors independent processing, with each channel contributing to a different aspect of the touch sensation 8-10. Psychophysically, the detection sensitivity of each channel is not affected by adaptation or masking of the other channels 2, 11-13. Physiologically, the different tactile channels appear to be segregated from the periphery up to, at least, the first stage of cortical processing, i.e., the primary somatosensory cortex (S1) 14-16. However, while the psychophysical evidence for integration of multiple channels in vibrotactile frequency perception remains equivocal 17-19, recent physiological studies have shown that single neurons receive peripheral inputs from multiple channels even in S1 20-22.

Regarding interactions of peripheral signals from different skin locations, several studies have suggested interactions beyond strict somatotopic mapping. Physiologically, some neurons in S1 or higher areas have multi-finger/hand receptive fields, receive projections from multiple peripheral neurons, and show inhibition/facilitation effects among them 23-29. Psychophysical masking occurs even when similar input signals are presented to skin locations