Tactile localization biases are modulated by gaze direction
Identifying the spatial location of touch on the skin surface is a fundamental function of our somatosensory system. Although stimulation of even single mechanoreceptive afferent fibres is sufficient to produce clearly localized percepts, tactile localization can also be modulated by higher-level processes such as body posture. This suggests that tactile events are coded by multiple representations based on different coordinate systems. Recent reports provide evidence for systematic biases in tactile localization tasks, which are thought to result from a supramodal representation of the skin surface. While the influence of non-informative vision of the body and gaze direction on tactile discrimination tasks has been extensively studied, their effects on tactile localization tasks remain largely unexplored. To address this question, participants performed a tactile localization task on their left hand under different visual conditions by means of a mirror box: in the mirror condition, a single stimulus was delivered to the participants' left hand while the reflection of the right hand was seen through the mirror; in the object condition, participants looked at a box through the mirror; and in the right-hand condition, participants looked directly at their right hand. Participants reported the location of the tactile stimuli on a silhouette of a hand. Results showed a shift in the localization of the touches towards the tips of the fingers (distal bias) and towards the thumb (radial bias) across conditions. Critically, distal biases were reduced when participants looked towards the mirror compared to when they looked at their right hand, suggesting that gaze direction reduces the typical proximo-distal biases in tactile localization. Moreover, vision of the hand modulated the internal configuration of the judged locations, elongating it along the radio-ulnar axis.
Finger posture modulates structural body representations
Patients with lesions of the left posterior parietal cortex commonly fail to identify their fingers, a condition known as finger agnosia, yet are relatively unimpaired in sensation and skilled action. Such dissociations have traditionally been interpreted as evidence that structural body representations (BSRs), such as the body structural description, are distinct from sensorimotor representations, such as the body schema. We investigated whether performance on tasks commonly used to assess finger agnosia is modulated by changes in hand posture. We used the 'in-between' test, in which participants estimate the number of unstimulated fingers between two touched fingers, and a localization task, in which participants judge which two fingers were stimulated. Across blocks, the fingers were placed at three levels of splay. Judged finger numerosity was analysed, in Exp. 1 by direct report and in Exp. 2 as the actual number of fingers between the fingers named. In both experiments, judgments were greater when non-adjacent stimulated fingers were positioned far apart compared to when they were close together or touching, whereas judgments were unaltered when adjacent fingers were stimulated. This demonstrates that BSRs are not fixed, but are modulated by the real-time physical distances between body parts.
The effects of instrumental action on perceptual hand maps
Perceiving the external spatial location of body parts using position sense requires that immediate proprioceptive afferent signals be integrated with information about body size and shape. Longo and Haggard (Proc Natl Acad Sci USA 107:11727–11732, 2010) developed a method to measure perceptual hand maps reflecting this metric information about body size and shape. In this paradigm, participants indicate the perceived location of landmarks on their occluded hand by pointing with a long baton held in their other hand. By comparing the relative locations of judgments of different hand landmarks, perceptual hand maps can be constructed and compared to actual hand structure. The maps show large and highly stereotyped distortions. Here, I investigated whether biases related to active motor control of the pointing hand contribute to these distortions. Participants localized the fingertip and knuckle of each finger on their occluded left hand either by actively pointing with a baton held in their right hand (pointing condition) or by giving verbal instructions to an experimenter on how to move the baton (verbal condition). Similar distortions were clearly apparent in both conditions, suggesting that they are not an artifact of motor control biases related to the pointing hand.
Distorted body representations are robust to differences in experimental instructions
Several recent reports have shown that even healthy adults maintain highly distorted representations of the size and shape of their body. These distortions have been shown to be highly consistent across different study designs and dependent measures. However, previous studies have found that visual judgments of size can be modulated by the experimental instructions used, for example, by asking for judgments of the participant's subjective experience of stimulus size (i.e., apparent instructions) versus judgments of actual stimulus properties (i.e., objective instructions). Previous studies investigating internal body representations have relied exclusively on 'apparent' instructions. Here, we investigated whether apparent versus objective instructions modulate findings of distorted body representations underlying position sense (Exp. 1), tactile distance perception (Exp. 2), and the conscious body image (Exp. 3). Our results replicate the characteristic distortions previously reported for each of these tasks and further show that these distortions are not affected by instruction type (i.e., apparent vs. objective). These results show that the distortions measured with these paradigms are robust to differences in instructions and do not reflect a dissociation between perception and belief.
Involvement of human primary somatosensory cortex in vibrotactile detection depends on task demand
The contribution of primary and secondary somatosensory cortices to the representation of body parts and body sides: an fMRI adaptation study
Although the somatosensory homunculus is a classic description of how somatosensory inputs are processed in the brain, the actual contributions of the primary (SI) and secondary (SII) somatosensory cortices to the spatial coding of touch remain poorly understood. We studied adaptation of the fMRI BOLD response in the somatosensory cortex by delivering pairs of vibrotactile stimuli to the fingertips of the index and middle fingers. The first stimulus (adaptor) was delivered either to the index or to the middle finger of the right or left hand, whereas the second stimulus (test) was always administered to the left index finger. The overall BOLD response evoked by the stimulation was primarily contralateral in SI and more bilateral in SII. However, our fMRI adaptation approach also revealed that both somatosensory cortices were sensitive to ipsilateral as well as to contralateral inputs. SI and SII adapted more after subsequent stimulation of homologous than of nonhomologous fingers, showing a distinction between different fingers. Most importantly, for both somatosensory cortices, this finger-specific adaptation occurred irrespective of whether the tactile stimulus was delivered to the same or to different hands. This result implies integration of contralateral and ipsilateral somatosensory inputs in both SI and SII. Our findings suggest that SI is more than a simple relay for sensory information and that both SI and SII contribute to the spatial coding of touch by discriminating between body parts (fingers) and by integrating somatosensory input from the two sides of the body (hands).
