Developmental Science 17:6 (2014), pp 944–945

DOI: 10.1111/desc.12186

COMMENTARY

Development of the spatial coding of touch: ability vs. automaticity

Brigitte Röder, Tobias Heed and Stephanie Badde
Biological Psychology and Neuropsychology, University of Hamburg, Germany

This is a commentary on Begum Ali et al. (2014).

Localization of touch in 3D space is essential for the integration with information from other sensory modalities, as well as for planning actions towards the source of the touch (Pouget, Ducom, Torri & Bavelier, 2002; Sober & Sabes, 2005). In initial processing stages, as in the primary somatosensory cortex, touch is represented in a somatotopic (that is, skin-based) reference frame. The process of transforming touch representations from this modality-specific coordinate system into an external reference frame has been called tactile remapping (Driver & Spence, 1998). Different lines of research have suggested that remapping proceeds automatically in adults (Kitazawa, 2002; Azañón, Camacho & Soto-Faraco, 2010; Badde, Heed & Röder, 2014). Tactile remapping has often been investigated with the tactile temporal order judgment (TOJ) task, in which two successively presented tactile stimuli, one presented to either hand, must be ordered in time by indicating which hand was stimulated first. Adults are markedly impaired in this task when they adopt a crossed rather than a parallel hand posture (Yamamoto & Kitazawa, 2001; Shore, Spry & Spence, 2002). This crossing effect has been attributed to a conflict between the somatotopic and the external reference frames of touch: with crossed hands, the right hand (somatotopic reference frame) lies in the left hemispace (external reference frame) (Shore et al., 2002; Badde, Heed & Röder, 2013). Because it would be possible to solve the TOJ task based on somatotopic coordinates alone, the crossing effect has been interpreted as indicating that tactile remapping is initiated automatically, independent of task demands (Kitazawa, 2002).

Pagel and colleagues (2009) administered a child-adapted TOJ task to children aged between 5 and 10 years. To avoid the use of left and right labels for responding, cat and dog stickers were attached to the

hands. Participants indicated the location of the first touch by saying ‘cat’ or ‘dog’. Because the stickers referred to a hand (not to the left and right sides of space), responses were coded in somatotopic space, and the calculation of external coordinates was not required. Crossing effects, indicative of automatic tactile remapping, were evident starting at age 5½–6, but not before.

In their recent study, Begum Ali, Cowie and Bremner (this volume) used a related paradigm in children aged 4–6 years. When vision of the hands was prevented, a crossing effect was evident in all age groups, seemingly contradicting Pagel and colleagues’ findings. However, Begum Ali and colleagues’ paradigm differed in two important aspects from the earlier study: first, rather than making TOJ of two stimuli, children localized a single touch. This task is considerably easier and results in overall higher performance compared to the TOJ task (Badde, Heed & Röder, 2012). Second, the children in Begum Ali and colleagues’ study placed their hands in a box with two plush toys mounted on top of the box, one on the left side, one on the right side. Participants had to say which toy had tickled their finger. Crucially, the placement of the plush toys was fixed, so that the assignment of toy to hand changed from the uncrossed to the crossed posture. Thus, participants had to localize touch with respect to external space, requiring an explicit computation of external tactile coordinates to solve the task. This task thus tested children’s ability to remap touch on demand. In contrast, Pagel and colleagues tested whether touch is recoded automatically even when not required. In Begum Ali and collaborators’ single touch task, children should perform worse with crossed hands the less well they are able to compute the external tactile coordinates. In contrast, the somatotopic coordinates could be used with uncrossed hands (because both a

Address for correspondence: Brigitte Röder, University of Hamburg, Biological Psychology and Neuropsychology, Von-Melle-Park 11, D-20146 Hamburg, Germany; e-mail [email protected]

© 2014 John Wiley & Sons Ltd


somatotopic and an external reference frame point to the same hand). Consequently, the younger children’s crossing effect implies that they are better at localizing touch in somatotopic than in external space. Indeed, with increasing age, the crossing effect decreased, suggesting improved external localization performance. Viewing the two studies together, younger children have the (limited) ability to localize touch in external space, but they do not do so automatically (Pagel et al., 2009).

This interpretation is substantiated by studies which investigated development with a visual deprivation approach. Congenitally blind adults did not exhibit a TOJ crossing effect, whereas late blind adults did (Röder, Rösler & Spence, 2004). These findings imply that developmental vision is essential for the automatic use of external coordinates in touch localization and some other tasks (Crollen, Dormal, Seron, Lepore & Collignon, 2013). Nevertheless, congenitally blind individuals are able to localize their hands in external space when required. For example, when responding to left and right tones with the hand nearest to the sound, they performed well both with uncrossed and with crossed hands. Yet, from the uncrossed- to the crossed-hands condition, congenitally blind adults showed a more pronounced reaction time increase than sighted and late blind individuals (Röder, Kusmierek, Spence & Schicke, 2007). Thus, although congenitally blind individuals do not automatically rely on external coordinates, they can do so at additional processing cost when mandatory.

In sum, the ability to localize touch externally begins to develop early, possibly in the first year of life (Bremner, Mareschal, Lloyd-Fox & Spence, 2008), and continues to improve during childhood (Begum Ali et al., this volume). However, the automatic use of external coordinates for touch localization emerges later, that is, not before the age of 5½.

References

Azañón, E., Camacho, K., & Soto-Faraco, S. (2010). Tactile remapping beyond space. European Journal of Neuroscience, 31 (10), 1858–1867. doi:10.1111/j.1460-9568.2010.07233.x

Badde, S., Heed, T., & Röder, B. (2012). Touch remapping is automatic but top-down modulated. Abstract. Annual Meeting of the Society for Neuroscience, New Orleans, USA, 13–17 October.

Badde, S., Heed, T., & Röder, B. (2013). Modelling body posture effects on reference frame integration. Abstract. 14th International Multisensory Research Forum, Jerusalem, Israel, 3–6 May.

Badde, S., Heed, T., & Röder, B. (2014). Processing load impairs coordinate integration for the localization of touch. Attention, Perception, & Psychophysics, 74, 1302–1311.

Begum Ali, J., Cowie, D., & Bremner, A.J. (2014). Effects of posture on tactile localization by 4 years of age are modulated by sight of the hands: evidence for an early acquired external spatial frame of reference for touch. Developmental Science, this issue.

Bremner, A.J., Mareschal, D., Lloyd-Fox, S., & Spence, C. (2008). Spatial localization of touch in the first year of life: early influence of a visual spatial code and the development of remapping across changes in limb position. Journal of Experimental Psychology: General, 137 (1), 149–162. doi:10.1037/0096-3445.137.1.149

Crollen, V., Dormal, G., Seron, X., Lepore, F., & Collignon, O. (2013). Embodied numbers: the role of vision in the development of number–space interactions. Cortex, 49 (1), 276–283.

Driver, J., & Spence, C. (1998). Cross-modal links in spatial attention. Philosophical Transactions of the Royal Society B: Biological Sciences, 353 (1373), 1319–1331. doi:10.1098/rstb.1998.0286

Kitazawa, S. (2002). Where conscious sensation takes place. Consciousness and Cognition, 11 (3), 475–477. doi:10.1016/S1053-8100(02)00031-4

Pagel, B., Heed, T., & Röder, B. (2009). Change of reference frame for tactile localization during child development. Developmental Science, 12 (6), 929–937. doi:10.1111/j.1467-7687.2009.00845.x

Pouget, A., Ducom, J.C., Torri, J., & Bavelier, D. (2002). Multisensory spatial representations in eye-centered coordinates for reaching. Cognition, 83 (1), B1–B11. doi:10.1016/S0010-0277(01)00163-9

Röder, B., Kusmierek, A., Spence, C., & Schicke, T. (2007). Developmental vision determines the reference frame for the multisensory control of action. Proceedings of the National Academy of Sciences, USA, 104 (11), 4753–4758. doi:10.1073/pnas.0607158104

Röder, B., Rösler, F., & Spence, C. (2004). Early vision impairs tactile perception in the blind. Current Biology, 14 (2), 121–124. doi:10.1016/S0960-9822(03)00984-9

Shore, D.I., Spry, E., & Spence, C. (2002). Confusing the mind by crossing the hands. Cognitive Brain Research, 14 (1), 153–163. doi:10.1016/S0926-6410(02)00070-8

Sober, S.J., & Sabes, P.N. (2005). Flexible strategies for sensory integration during motor planning. Nature Neuroscience, 8 (4), 490–497. doi:10.1038/nn1427

Yamamoto, S., & Kitazawa, S. (2001). Reversal of subjective temporal order due to arm crossing. Nature Neuroscience, 4 (7), 759–765. doi:10.1038/89559
