Murray MM, Wallace MT, editors. The Neural Bases of Multisensory Processes. Boca Raton (FL): CRC Press/Taylor & Francis; 2012.

Chapter 28. The Body in a Multisensory World

28.1. INTRODUCTION

It is through our body that we interact with the environment. We have a very clear sense of who we are: we know where our body ends and which body parts we own. Beyond that, we usually are (or can easily become) aware of where each of our body parts is currently located, and most of our movements seem effortless, whether performed under conscious control or not. When we think about ourselves, we normally perceive our body as a stable entity. For example, when we go to bed, we do not expect that our body will be different when we wake up the next morning. Quite contrary to such introspective assessment, the brain has been found to be surprisingly flexible in updating its representation of the body. As an illustration, consider what happens when an arm or leg becomes numb after you have sat or slept in an unsuitable position for too long. Touching the numb foot feels very strange, as if you were touching someone else’s foot. When you lift a numb hand with the other hand, it feels far too heavy. Somehow, it feels as if the limb does not belong to your own body.

Neuroscientists have long been fascinated with how the brain represents the body. It is usually assumed that there are several different types of body representations, but there is no consensus about what these representations are, or how many there may be (de Vignemont 2010; Gallagher 1986; Berlucchi and Aglioti 2010; see also Dijkerman and de Haan 2007 and commentaries thereof). The most common distinction is that between a body schema and a body image. The body schema is usually defined as a continuously updated sensorimotor map of the body that is important in the context of action, informing the brain about which parts belong to the body and where those parts are currently located (de Vignemont 2010). In contrast, the term body image is usually used to refer to perceptual, emotional, or conceptual knowledge about the body. However, other taxonomies have been proposed (see Berlucchi and Aglioti 2010; de Vignemont 2010), and the use of the terms body schema and body image has been inconsistent. This chapter will not present an exhaustive debate about these definitions, and we refer the interested reader to the articles cited above for detailed discussion; in this article, we will use the term body schema with the sensorimotor definition introduced above, referring to both aspects: which parts make up the body, and where they are located.

The focus of this chapter will be on the importance of multisensory processing for representing the body, as well as on the role of body representations for multisensory processing. On one hand, one can investigate how the body schema is constructed and represented in the brain, and Section 28.2 will illustrate that the body schema emerges from the interaction of multiple sensory modalities. For this very reason, one can, on the other hand, ask how multisensory interactions between the senses are influenced by the fact that the brain commands a body. Section 28.3, therefore, will present research on how the body schema is important in multisensory interactions, especially for spatial processing.

28.2. CONSTRUCTION OF BODY SCHEMA FROM MULTISENSORY INFORMATION

28.2.1. Representing Which Parts Make Up the Own Body

There is some evidence suggesting that an inventory of the normally existing body parts is genetically predetermined. Just like amputees, people born without arms and/or legs can have vivid sensations of the missing limbs, including the feeling of using them for gestural movements during conversation and for finger-aided counting (Ramachandran 1993; Ramachandran and Hirstein 1998; Brugger et al. 2000; Saadah and Melzack 1994; see also Lacroix et al. 1992). Such vividly felt but physically absent limbs have been termed phantom limbs. Whereas the existence of a phantom limb in amputees could be explained by the persistence of experience-induced representations of this limb after the amputation, such an explanation does not hold for congenital phantom limbs. In one person with congenital phantom limbs, transcranial magnetic stimulation (TMS) over primary motor, premotor, parietal, and primary sensory cortex evoked sensations and movements of the congenital phantom limbs (Brugger et al. 2000). This suggests that the information about which parts make up one’s own body is distributed across different areas of the brain.

There are not many reports of congenital phantoms in the literature, and so the phenomenon may be rare. However, the experience of phantom limbs after the loss of a limb, for example, due to amputation, is very common. It has been reported (Simmel 1962) that the probability of perceiving phantom limbs gradually increases with the age of limb loss from very young (2 of 10 children with amputations below the age of 2) to the age of 9 years and older (all of 60 cases), suggesting that developmental factors within this age interval may be crucial for the construction of the body schema (and, in turn, for the occurrence of phantom limbs).

The term “phantom limb” refers to limbs that would normally be present in a healthy person. In contrast, a striking impairment after brain damage, for example, to the basal ganglia (Halligan et al. 1993), the thalamus (Bakheit and Roundhill 2005), or the frontal lobe (McGonigle et al. 2002), is the report of one or more supernumerary limbs in addition to the normal limbs. The occurrence of a supernumerary limb is usually associated with the paralysis of the corresponding real limb, which is also attributable to the brain lesion. The supernumerary limb is vividly felt, and patients confabulate to rationalize why the additional limb is present (e.g., it was attached by the clinical staff during sleep), and why it is not visible (e.g., it was lost 20 years ago) (Halligan et al. 1993; Sellal et al. 1996; Bakheit and Roundhill 2005). It has therefore been suggested that the subjective presence of a supernumerary limb may result from cognitive conflicts between different pieces of sensory information (e.g., visual vs. proprioceptive) or fluctuations in the awareness about the paralysis, which in turn may be resolved by assuming the existence of two (or more) limbs rather than one (Halligan et al. 1993; Ramachandran and Hirstein 1998).

Whereas a patient with a phantom or a supernumerary limb perceives more limbs than he actually owns, some brain lesions result in the opposite phenomenon of patients denying the ownership of an existing limb. This impairment, termed somatoparaphrenia, has been reported to occur after temporo-parietal (Halligan et al. 1995) or thalamic-temporo-parietal damage (Daprati et al. 2000)—notably all involving the parietal lobe, which is thought to mediate multisensory integration for motor planning. Somatoparaphrenia is usually observed in conjunction with hemineglect and limb paralysis (Cutting 1978; Halligan et al. 1995; Daprati et al. 2000) and has been suggested to reflect a disorder of body awareness due to the abnormal sensorimotor feedback for the (paralyzed) limb after brain damage (Daprati et al. 2000).

Lesions can also affect the representation of the body and self as a whole, rather than just affecting single body parts. These experiences have been categorized into three distinct phenomena (Blanke and Metzinger 2009). During out-of-body experiences, a person feels located outside of her real body and looks at herself, often from above. In contrast, during an autoscopic illusion, the person localizes herself in her real body, but sees an illusory body in extrapersonal space (e.g., in front of herself). Finally, during heautoscopy, a person sees a second body and feels located in both bodies, either at the same time or in sometimes rapid alternation. In patients, such illusions have been suggested to be related to damage to the temporo-parietal junction (TPJ) (Blanke et al. 2004), and an out-of-body experience was elicited by stimulation through an electrode implanted over the TPJ for presurgical assessment (Blanke et al. 2002). Interestingly, whole body illusions can coincide with erroneous visual perceptions about body parts, for example, an impression of limb shortening or illusory flexion of an arm. It has therefore been suggested that whole body illusions are directly related to the body schema, resulting from a failure to integrate multisensory (e.g., vestibular and visual) information about the body and its parts, similar to the proposed causes of supernumerary limbs (Blanke et al. 2004).

In sum, many brain regions are involved in representing the configuration of the body; some aspects of these representations seem to be innate, and are probably refined during early development. Damage to some of the involved brain regions can lead to striking modifications of the perceived body configuration, as well as to illusions about the whole body.

28.2.2. Multisensory Integration for Limb and Body Ownership

Although the previous section suggests that some aspects of the body schema may be hardwired, the example of the numb foot with which we started this chapter suggests that the body schema is a more flexible representation. Such fast changes of the body’s representation have been demonstrated with an ingenious experimental approach: misleading the brain as to the ownership status of a new object and provoking its inclusion into the body schema. This trick can be achieved by using rubber hands: a rubber hand is placed in front of a participant in such a way that it could belong to her own body, and it is then stroked in parallel with the participant’s real, hidden hand. Most participants report that they feel the stroking at the location of the rubber hand, and that they feel as if the rubber hand were their own (Botvinick and Cohen 1998). One of the main determinants for this illusion to arise is the synchrony of the visual and tactile stimulation. In other words, the touches felt at one’s own hand and those seen to be delivered to the rubber hand must match. It may in fact be possible to trick the brain into integrating objects other than hand-like ones into its body schema using this synchronous stroking technique: when the experimenter stroked not a rubber hand but a shoe placed on the table (Ramachandran and Hirstein 1998) or even the table surface (Armel and Ramachandran 2003), participants reported that they “felt” the touch delivered to their real hand to originate from the shoe or the table. Similarly, early event-related potentials (ERPs) in response to tactile stimuli were enhanced after synchronous stimulation of a rubber hand as well as of a non-hand object (Press et al. 2008). Even more surprisingly, participants in Armel and Ramachandran’s study displayed signs of distress and an increased skin conductance response when the shoe was hit with a hammer, or a band-aid was ripped off the table surface. Similar signs of distress were also observed when the needle of a syringe was stabbed into the rubber hand, and these behavioral responses were associated with brain activity in anxiety-related brain areas (Ehrsson et al. 2007). Thus, the mere synchrony of visual events at an object with the tactile sensations felt at the hand seems to have led to some form of integration of the objects (the rubber hand, the shoe, or the table surface) into the body schema, resulting in physiological and emotional responses usually reserved for the real body. It is important to understand that participants in the rubber hand illusion (RHI) do not feel additional limbs; rather, they feel a displacement of their own limb, which is reflected behaviorally by reaching errors after the illusion has manifested itself (Botvinick and Cohen 1998; Holmes et al. 2006; but see Kammers et al. 2009a, 2009c, and discussion in de Vignemont 2010), and by an adjustment of grip aperture when finger posture has been manipulated during the RHI (Kammers et al. 2009b). Thus, a new object (the rubber hand) is integrated into the body schema, but is interpreted as an already existing part (one’s own, hidden arm).

The subjective feeling of ownership of a rubber hand has also been investigated using functional magnetic resonance imaging (fMRI). Activity emerged in the ventral premotor cortex and (although only at trend level, statistically) in the superior parietal lobule (SPL) (Ehrsson et al. 2004). In the monkey, both of these areas respond to peripersonal stimuli around the hand and head. Activity related to multisensory integration—synchrony of tactile and visual events, as well as the alignment of visual and proprioceptive information about arm posture—was observed in the SPL, presumably in the human homologue of an area in the monkey concerned with arm reaching [the medial intraparietal (MIP) area]. Before the onset of the illusion, that is, during its buildup, activity was seen in the intraparietal sulcus (IPS), in the dorsal premotor cortex (PMd), and in the supplementary motor area (SMA), which are all thought to be part of an arm-reaching circuit in both monkeys and humans. Because the rubber arm is interpreted as one’s own arm, the illusion may be based on a recalibration of perceived limb position, mediated parietally, according to the visual information about the rubber arm (Ehrsson et al. 2004; Kammers et al. 2009c). As such, current multisensory information about the alleged position of the hand must be integrated with long-term knowledge about body structure (i.e., the fact that there is a hand to be located) (de Vignemont 2010; Tsakiris 2010).

Yet, as noted earlier, an integration of a non-body-like object also seems possible in some cases. Besides the illusory integration of a shoe or the table surface due to synchronous stimulation, an association of objects with the body has been reported in the clinical case of a brain-lesioned patient who denied ownership of her arm and hand; when she wore her wedding ring on that hand, she did not recognize the ring as her own. When it was taken off the neglected hand, the patient immediately recognized the ring as her own (Aglioti et al. 1996). Such findings might therefore indicate an involvement of higher cognitive processes in the construction of the body schema.

It was mentioned in the previous section that brain damage can lead to misinterpretations of single limbs (say, an arm or a leg), but also of the whole body. Similarly, the rubber hand paradigm has been modified to also study the processes involved in the perception of the body as a whole and of the feeling of self. Participants viewed a video image of themselves filmed from the back (Ehrsson 2007) or a virtual reality character at some distance in front of them (Lenggenhager et al. 2007). They could see the back of the figure in front of them being stroked in synchrony with feeling their own back being stroked. This manipulation resulted in the feeling of the self being located outside one’s own body and of looking at oneself (Ehrsson 2007). Furthermore, when participants were displaced from their viewing position and asked to walk to the location at which they had felt “themselves” during the illusion, they placed themselves in between the real and the virtual body’s locations (Lenggenhager et al. 2007). Although both rubber hand and whole body illusions use the same kind of multisensory manipulation, the two phenomena have been proposed to tap into different aspects of body processing (Blanke and Metzinger 2009): whereas the rubber hand illusion leads to the incorporation of an object into the body schema, the whole body illusion manipulates the location of a global “self” (Blanke and Metzinger 2009; Metzinger 2009), and accordingly the first-person perspective (Ehrsson 2007). This distinction notwithstanding, both illusions convincingly demonstrate how the representation of the body in the brain is determined by the integration of multisensory information.

To sum up, our brain uses the synchrony of multisensory (visual and tactile) stimulation to determine body posture. Presumably because touch is necessarily located on the body, such synchronous visuo-tactile stimulation can lead to the illusion that external objects belong to our body, and even to mislocalization of the whole body. However, the illusion is not of a new body part having been added, but rather of a non-body object taking the place of an already existing body part (or, in the case of the whole body illusion, of the video image indicating our body’s location).

28.2.3. Extending the Body: Tool Use

At first sight, the flexibility of the body schema demonstrated with the rubber hand illusion and the whole body illusion may seem a hindrance rather than an asset. However, a very common situation in which such integration may be very useful is the use of tools. Humans, and to some extent also monkeys, use tools to complement and extend the abilities and capacity of their own body parts to act upon their environment. In this situation, visual events at the tip of the tool (or, more generally, at the part of the tool used to manipulate the environment) coincide with tactile information received at the hand—a constellation that is very similar to the synchronous stroking of a non-body object and a person’s hand. Indeed, some neurons in the intraparietal part of area PE (PEip) of monkeys respond to tactile stimuli to the hand, as well as to visual stimuli around the tactile location (see also Section 28.3.2). When the monkey was trained to use a tool to retrieve otherwise unreachable food, the visual receptive fields (RFs), which encompassed only the hand when no tool was used, now encompassed both the hand and the tool (Iriki et al. 1996). In a similar manner, when the monkey learned to observe its hand on a monitor rather than seeing it directly, the visual RFs now encompassed the monitor hand (Obayashi et al. 2000). These studies have received some methodological criticism (Holmes and Spence 2004), but their results are often interpreted as some form of integration of the tool into the monkey’s body schema. Neurons with such RF characteristics might therefore be involved in mediating the rapid body schema modulations illustrated by the rubber hand illusion in humans. Although these monkey findings are an important step toward understanding tool use and its relation to the body schema, it is important to note that the mechanisms discovered in the IPS cannot explain all phenomena involved either in tool use or in ownership illusions. For example, it has been pointed out that a tool does not usually feel like one’s own body part, even when it is frequently used, as is the case for a fork (Botvinick 2004). Such true ownership feelings may rather be restricted to body part–shaped objects such as a prosthesis or a rubber hand, provided they are located in an anatomically plausible position (Graziano and Gandhi 2000; Pavani et al. 2000). For the majority of tools, one might rather feel that the sensation of a touch is projected to the action-related part of the tool (usually the tip), just as one may feel the touch of a pen to occur between the paper and the pen tip, and not at the fingers holding the pen (see also Yamamoto and Kitazawa 2001b; Yamamoto et al. 2005). Accordingly, rather than the tool being integrated into the body schema, it may be that tool use directs attention toward the part of space that is relevant for the currently performed action. Supporting such interpretations, it has recently been shown that when a reach was planned with a tool, visual attention was enhanced at the movement endpoint of the tool as well as at the movement endpoint of the hand. Attention was not enhanced, however, at locations in between along the tool (Collins et al. 2008). Similarly, cross-modal (visual-tactile) interactions have been shown to be enhanced at the tool tip and at the hand, but not at locations along the tool (Holmes et al. 2004; Yue et al. 2009).
Finally, in a recent study, participants were asked to make tactile discrimination judgments about stimuli presented to the tip of a tool. Visual distractors were presented in parallel to the tactile stimuli. fMRI activity in response to the visual distractors near the end of the tool was enhanced in the occipital cortex, compared to locations further away from the tool (Holmes et al. 2008). These findings were also interpreted to indicate an increase of attention at the tool tip, due to the use of the tool.

Experimental results such as these challenge the idea of an extension of the body schema. Other results, in contrast, do corroborate the hypothesis of an extension of the body schema due to tool use. For example, tool use resulted in a change of the perceived distance between two touches to the arm, which was interpreted to indicate an elongated representation of the arm (Cardinali et al. 2009b).

It has recently been pointed out that the rubber hand illusion seems to consist of several dissociable aspects (Longo et al. 2008), revealed by a factor analysis of questionnaire responses about the experience of the rubber hand illusion. More specific distinctions may need to be made about the different processes (and, as a consequence, the different effects found in experiments) involved in the construction of the body schema, and different experimental paradigms may tap into only a subset of these processes.

In sum, multisensory signals are not only important for determining which parts we perceive our body to be made of. Multisensory mechanisms are also important in mediating the ability to use tools. It is currently under debate whether tools extend the body schema by being integrated as body parts, or whether other multisensory processes, for example, a deployment of attention to the space manipulated by the tool, are at the core of our ability to use tools.

28.2.4. Rapid Plasticity of Body Shape

The rubber hand illusion demonstrates that what the brain interprets as one’s own body can be rapidly adjusted to the information that is received from the senses. Rapid changes of the body schema are, however, not restricted to the inventory of body parts considered to belong to the body, or their current posture. They also extend to the body’s shape. We already mentioned that the representation of the arm may be elongated after tool use (Cardinali et al. 2009b). An experience most of us have had is the feeling of an increased size of an anesthetized body part, for example, the lip during a dentist’s appointment (see also Türker et al. 2005; Paqueron et al. 2003). Somewhat more spectacularly, when participants hold the tip of their nose with thumb and index finger while their biceps muscle is vibrated to induce the illusion of the arm moving away from the body, many report that they perceive their nose to elongate to a length of up to 30 cm (sometimes referred to as the Pinocchio illusion; Lackner 1988). A related illusion can be evoked when an experimenter guides the finger of a participant to irregularly tap the nose of a second person (seated next to the participant), while synchronously tapping the participant’s own nose (Ramachandran and Hirstein 1997; see also discussion in Ramachandran and Hirstein 1998). Both illusions are induced by presenting the brain with mismatching information about touch and proprioception. They demonstrate that, despite the fact that our life experience would seem to preclude sudden elongations of the nose (or any other body part, for that matter), the body schema is readily adapted when sensory information from different modalities (here, tactile and proprioceptive) calls for an integration of initially mismatching content.

The rubber hand illusion has been used also to investigate effects of the perception of body part size. Participants judged the size of a coin to be bigger when the illusion was elicited with a rubber hand bigger than their own, and to be smaller when the rubber hand was smaller (Bruno and Bertamini 2010). The rubber hand illusion thus influenced tactile object perception. This influence was systematic: as the real object held by the participants was always the same size, their finger posture was identical in all conditions. With the illusion of a small hand, this posture would indicate a relatively small distance between the small fingers. In contrast, with the illusion of a big hand, the same posture would indicate a larger distance between the large fingers.

Similarly, visually perceived hand size has also been shown to affect grip size, although more so when the visual image of the hand (a projection of an online video recording of the hand) was bigger than normal (Marino et al. 2010).

The rubber hand illusion has also been used to create the impression of having an elongated arm by having participants wear a shirt with an elongated sleeve from which the rubber hand protruded (Schaefer et al. 2007). By recording magnetoencephalographic (MEG) responses to tactile stimuli to the illusion hand, this study also demonstrated an involvement of primary somatosensory cortex in the illusion.

These experiments demonstrate that perception of the body can be rapidly adjusted by the brain, and that these perceptual changes in body shape affect object perception as well as hand actions.

28.2.5. Movement and Posture Information in the Brain

The rubber hand illusion shows how intimately body part ownership and body posture are related: in this illusion, an object is felt to belong to one’s own body, and at the same time, the real arm is felt to be located at the position of the rubber arm. Posture, in turn, is of course intimately related to movement, as every movement leads to a change in posture. However, different brain areas seem to be responsible for perceiving movement and posture.

The perception of limb movement seems to depend on the primary sensory and motor cortex as well as on the premotor and supplementary motor cortex (reviewed by Naito 2004). This is true also for the illusory movement of phantom limbs, which is felt as real movement (Bestmann et al. 2006; Lotze et al. 2001; Roux et al. 2003; Brugger et al. 2000). The primary motor cortex, in particular, may play a crucial role in movement perception. One can create an illusion of movement by vibrating the muscles responsible for the movement of a body part, for example, the arm or hand. When a movement illusion is created for one hand, this illusion transfers to the other hand if the palms of the two hands touch. For both hands, fMRI activity increased in primary motor cortex, suggesting a primary role of this motor-related structure also for the sensation of movement (Naito et al. 2002).

In contrast, the current body posture seems to be represented quite differently from limb movement. Proprioceptive information arrives in the cortex via the somatosensory cortex. Accordingly, neuronal responses in secondary somatosensory cortex (SII) to tactile stimuli to a monkey’s hand were shown to be modulated by the monkey’s arm posture (Fitzgerald et al. 2004). In humans, the proprioceptive drift associated with the rubber hand illusion—that is, the change of the subjective position of one’s own hand toward the location of the rubber hand—was correlated with activity in SII acquired with PET (Tsakiris et al. 2007). SII was also implicated in body schema functions by a study in which participants determined the laterality of an arm seen on a screen by imagining turning their own arm until it matched the seen one, as compared to when they determined the onscreen arm’s laterality by imagining moving it toward the appropriate location on a body that was also presented on the screen (Corradi-Dell’Acqua et al. 2009). SII was thus active specifically during the imagination of one’s own posture when making a postural judgment.

However, many other findings implicate hierarchically higher, more posterior parietal areas in the maintenance of a posture representation. When participants were asked to reach with their hand to another body part, activity increased in the SPL after a posture change, as compared to when participants repeated a movement they had just executed before. This posture change effect was observed both when the reaching hand changed its posture and when participants reached with one hand to the other and the target hand rather than the reaching hand changed its posture (Pellijeff et al. 2006). Although the authors interpreted their results as reflecting postural updating, they may instead be attributable to reach planning. However, a patient with an SPL lesion displayed symptoms that corroborate the view that the SPL is involved in the maintenance of a continuous postural model of the body (Harris and Wolpert 1998). This patient complained that her arm and leg felt like they drifted and then faded, unless she could see them. This subjective feeling was accompanied by an inability to maintain grip force as well as a loss of tactile perception of a vibratory stimulus after it had been applied for several seconds. Because the patient’s deficit was not a general inability to detect tactile stimulation or perform hand actions, these results seem to imply that it was the maintenance of the current postural state of the body that was lost over time unless new visual, tactile, or proprioceptive information forced an update of the model. The importance of the SPL for posture control is also evident from a patient who, after SPL damage, lost her ability to interact correctly with objects in ways requiring whole body coordination, such as sitting down on a chair (Kase et al. 1977). Still further evidence for an involvement of the SPL in posture representation comes from experiments in healthy participants. When people are asked to judge the laterality of a hand presented in a picture, these judgments are influenced by the current hand posture adopted by the participant: the more unnatural it would be to align one’s own hand with the displayed hand, the longer participants take to respond (Parsons 1987; Ionta et al. 2007). A hand posture change during the hand laterality task led to an activation in the SPL in fMRI (de Lange et al. 2006). Hand crossing also led to a change in intraparietal activation during passive tactile stimulation (Lloyd et al. 2003). Finally, recall that fMRI activity during the buildup of the rubber hand illusion, thought to involve postural recalibration due to the visual information about the rubber arm, was also observed in the SPL.

These findings are consistent with data from neurophysiological recordings in monkeys showing that neurons in area 5 (Sakata et al. 1973) in the superior parietal lobule, as well as neurons in area PEc (located just at the upper border of the IPS and extending into the sulcus to border MIP; Breveglieri et al. 2008), respond to complex body postures, partly involving several limbs. Neurons in these areas respond to tactile, proprioceptive, and visual input (Breveglieri et al. 2008; Graziano et al. 2000). Furthermore, some area 5 neurons fire most when the felt and the seen position of the arm correspond rather than when they do not (Graziano 1999; Graziano et al. 2000). These neurons respond not only to vision of the monkey’s own arm, but also to vision of a fake arm, if it is positioned in an anatomically plausible way such that it looks as if it might belong to the animal’s own body, reminiscent of the rubber hand illusion in humans. Importantly, some neurons fire most when the visual information of the fake arm matches the posture of the monkey’s real, hidden arm, but reduce their firing rate when vision and proprioception do not match.

To summarize, body movement and body posture are represented by different brain regions. Movement perception relies on the motor structures of the frontal lobe. Probably, the most important brain region for the representation of body posture, in contrast, is the SPL. This region is known to integrate signals from different sensory modalities, and damage to it results in dysfunctions of posture perception and actions requiring postural adaptations. However, other brain regions are involved in posture processing as well.

28.2.6. The Body Schema: A Distributed versus Holistic Representation

The evidence reviewed so far has shown that what has been subsumed under the term body schema is not represented as one single, unitary entity in the brain—even if, from a psychological standpoint, it would seem to constitute an easily graspable and logically coherent concept. However, as has often proved to be the case in psychology and in the neurosciences, what researchers have hypothesized to be functional entities of the brain’s organization is not necessarily the way nature has evolved the brain. The organization of the parietal and frontal areas seems to be modular, and these areas appear to be specialized for certain body parts and their actions (Rizzolatti et al. 1998; Grefkes and Fink 2005; Andersen and Cui 2009), for example, for hand grasping, arm reaching, and eye movements. Similarly, at least in parts of the premotor cortex, RFs for the different sensory modalities are body part-centered (e.g., around the hand; see also Section 28.3.2), suggesting that, possibly, other body part-specific areas may feature coordinate frames anchored to those body parts (Holmes and Spence 2004). As a consequence, the holistic body schema that we subjectively experience has been proposed to emerge from the interaction of multiple space-, body-, and action-related brain areas (Holmes and Spence 2004).

28.2.7. Interim Summary

The first part of this chapter has highlighted how important the integration of multisensory information is for body processing. We showed that a representation of our body parts is probably innate, and that lesions to different brain structures such as the parietal and frontal lobes as well as subcortical structures can lead to malfunctions of this representation. Patients can perceive lost limbs as still present, report supernumerary limbs in addition to their normal ones, and deny the ownership of a limb. We went on to show how the integration of multisensory (usually visual and tactile) information is used in an online modification or “construction” of the body schema. In the rubber hand illusion, synchronous multisensory information leads to the integration of an external object into the body schema in the sense that the location of the real limb is felt to be at the external object. Multisensory information can also lead to adjustments of perceived body shape, as in the Pinocchio illusion. Information about body parts—their movement and their posture—is represented in a widespread network in the brain. Whereas limb movement perception seems to rely on motor structures, multisensory parietal areas are especially important for the maintenance of a postural representation. Finally, we noted that the current concept of the body schema in the brain is that of an interaction between many body part-specific representations.

28.3. THE BODY AS A MODULATOR FOR MULTISENSORY PROCESSING

The first part of this chapter has focused on the multisensory nature of the body schema with its two aspects of which parts make up the body, and where those parts are located in space and in relation to one another. These studies form the basis for an exploration of the specific characteristics of body processing and its relevance for perception, action, and the connection of these two processes. The remainder of this article, therefore, will adopt the opposite perspective from the first part: it will assume the existence of a body schema and explore its influence on multisensory processing.

One of the challenges for multisensory processing is that information from the different senses is received by sensors that are arranged very differently from modality to modality. In vision, light originating from neighboring spatial locations falls on neighboring rods and cones on the retina. When the eyes move, light from the same spatial origin falls on different sensors on the retina. Visual information is therefore initially eye-centered. Touch is perceived through sensors all over the skin. Because the body parts constantly move in relation to each other, a touch to the same part of the skin can correspond to very different locations in external, visual space. Similar challenges arise for the spatial processing in audition, but we will focus here on vision and touch.

28.3.1. Recalibration of Sensory Signals and Optimal Integration

In some cases, knowledge about body posture and movement is used to interpret sensory information. For example, Lackner and Shenker (1985) attached a light or a sound source to each hand of their participants, who sat in a totally dark room. They then vibrated the biceps muscles of the two arms; recall that muscle vibration induces the illusion of limb movement. In this experimental setup, participants perceived an outward movement of the two arms. Both the lights and the sound were perceived as moving with the apparent location of the hands, although the sensory information on the retina and in the cochlea remained identical throughout these manipulations. Such experimental findings have led to the proposal that the brain frequently recalibrates the different senses to ensure that the actions carried out with the limbs are in register with the external world (Lackner and DiZio 2000). The brain seems to use different sensory input to do this, depending on the experimental situation. In the rubber hand illusion, visual input about arm position apparently overrules proprioceptive information about the real position of the arm. In other situations, such as in the arm vibration illusion, proprioception can overrule vision.

Although winner-take-all schemes for such dominance of one sense over another have been proposed (e.g., Ramachandran and Hirstein 1998), there is ample evidence that inconsistencies in the information from the different senses do not simply lead to an overruling of one by the other. Rather, the brain seems to combine the different senses to come up with a statistically optimal estimate of the true environmental situation, allowing for statistically optimal movements (Körding and Wolpert 2004; Trommershäuser et al. 2003) as well as perceptual decisions (Ernst and Banks 2002; Alais and Burr 2004). Because in many cases one of our senses outperforms the others in a specific sensory ability—for example, spatial acuity is superior in vision (Alais and Burr 2004), and temporal acuity is best in audition (Shams et al. 2002; Hötting and Röder 2004)—many experimental results have been interpreted in favor of an “overrule” hypothesis. Nevertheless, it has been demonstrated, for example, in spatial tasks, that the weight the brain assigns to the information received through a sensory channel is directly related to that channel’s acuity, and that audition (Alais and Burr 2004) and touch (Ernst and Banks 2002) will overrule vision when visual acuity is sufficiently degraded. Such integration is probably involved also in body processing and in such phenomena as the rubber hand and Pinocchio illusions.
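For readers who prefer a concrete formulation of this reliability-weighted combination rule, the sketch below implements the standard maximum-likelihood estimate under the assumption of independent Gaussian noise on each unimodal estimate. The function name and the numerical values are illustrative only; they are not taken from the studies cited above.

```python
# Minimal sketch of reliability-weighted (maximum-likelihood) cue combination,
# in the spirit of Ernst and Banks (2002) and Alais and Burr (2004). Assumes
# independent Gaussian noise on each unimodal estimate; all numbers are made up.

def integrate(estimates_and_sigmas):
    """Combine unimodal location estimates, weighting each by its reliability (1/variance)."""
    weights = [1.0 / sigma ** 2 for _, sigma in estimates_and_sigmas]
    total = sum(weights)
    fused = sum(w * x for w, (x, _) in zip(weights, estimates_and_sigmas)) / total
    fused_sigma = (1.0 / total) ** 0.5  # fused estimate is never less reliable than the best cue
    return fused, fused_sigma

# Example: vision localizes a stimulus at 10 deg with high acuity (sigma = 1 deg),
# audition at 14 deg with low acuity (sigma = 4 deg). The fused estimate lies close
# to the visual one ("visual capture"), but would shift toward audition if the
# visual signal were degraded (i.e., if its sigma grew).
print(integrate([(10.0, 1.0), (14.0, 4.0)]))  # -> (~10.24, ~0.97)
```

On this account, "vision overrules touch" and "touch overrules vision" are simply the two ends of a continuum of weights, not qualitatively different modes of processing.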

In sum, the body schema influences how multisensory information is interpreted by the brain. The weight that a piece of multisensory information is given varies with its reliability (see also de Vignemont 2010).

28.3.2. Body Schema and Peripersonal Space

Many neurons in a brain circuit involving the ventral intraparietal (VIP) area and the ventral premotor cortex (PMv) feature tactile RFs, mainly around the monkey’s mouth, face, or hand. These tactile RFs are supplemented by visual and sometimes auditory RFs that cover the space up to ~30 cm around the body part (Fogassi et al. 1996; Rizzolatti et al. 1981a, 1981b; Graziano et al. 1994, 1999; Duhamel et al. 1998; Graziano and Cooke 2006). Importantly, when either the body part or the eyes are moved, the visual RF is adjusted online such that the tactile and the visual modality remain aligned within a given neuron (Graziano et al. 1994). When one of these neurons is electrically stimulated, the animal makes defensive movements (Graziano and Cooke 2006). Because of these unique RF properties, the restricted part of space represented by this VIP-PMv circuit has been termed the peripersonal space, and it has been suggested to represent a defense zone around the body. Note that the continuous spatial adjustment of the visual to the tactile RF requires both body posture and eye position to be integrated in a continuous manner. Two points therefore become immediately clear: first, the peripersonal space and the body schema are intimately related (see also Cardinali et al. 2009a); and second, like the body schema, the representation of peripersonal space includes information from several (if not all) sensory modalities. As is the case with the term “body schema,” the term “peripersonal space” has also been defined in several ways. It is sometimes used to denote the space within arm’s reach (see, e.g., Previc 1998). For the purpose of this review, “peripersonal space” will be used to denote the space directly around the body, in accord with the findings in monkey neurophysiology.

Different approaches have been taken to investigate whether peripersonal space is represented in humans as it is in monkeys. One of them has been the study of patients suffering from extinction. These patients are usually able to report single stimuli in all spatial locations, but fail to detect contralesional stimuli when these are concurrently presented with ipsilesional stimuli (Ladavas 2002). The two stimuli can be presented in two different modalities (Ladavas et al. 1998), indicating that the process that is disrupted by extinction is multisensory in nature. More importantly, extinction is modulated in some patients by the distance of the distractor stimulus (i.e., the ipsilesional stimulus that extinguishes the contralesional stimulus) from the hand. For example, in some patients a tactile stimulus to the contralesional hand is extinguished by an ipsilesional visual stimulus to a much higher degree when it is presented in the peripersonal space of the patient’s ipsilesional hand than when it is presented far from it (di Pellegrino and Frassinetti 2000). Therefore, extinction is modulated by two manipulations that are central to neurons representing peripersonal space in monkeys: (1) extinction can be multisensory and (2) it can dissociate between peripersonal and extrapersonal space. In addition, the locations of lesions associated with extinction coincide (at least coarsely) with the brain regions associated with peripersonal spatial functions in monkeys (Mort et al. 2003; Karnath et al. 2001). The study of extinction patients has therefore suggested that a circuit for peripersonal space exists in humans, analogous to that in the monkey.

The peripersonal space has also been investigated in healthy humans. One of the important characteristics of the way the brain represents peripersonal space is the alignment of visual and tactile events. In an fMRI study in which participants had to judge whether a visual stimulus and a tactile stimulus to the hand were presented on the same side of space, hand crossing led to an increase of activation in the secondary visual cortex, indicating an influence of body posture on relatively low-level sensory processes (Misaki et al. 2002). In another study, hand posture was manipulated in relation to the eye: rather than changing hand posture itself, gaze was directed such that a tactile stimulus occurred either in the right or the left visual hemifield. The presentation of bimodal visual-tactile stimuli led to higher activation in the visual cortex in the hemisphere contralateral to the visual hemifield of the tactile location, indicating that the tactile location was remapped with respect to visual space and then influenced visual cortex (Macaluso et al. 2002). These influences of posture and eye position on early sensory cortex may be mediated by parietal cortex. For example, visual stimuli were better detected when a tactile stimulus was concurrently presented (Bolognini and Maravita 2007). This facilitatory influence of the tactile stimulus was strongest when the hand was held near the visual stimulus, whether this implied an uncrossed or a crossed hand posture. However, hand crossing had a very different effect when neural processing in the posterior parietal cortex was impaired by repetitive TMS: now a tactile stimulus was most effective when it was delivered to the hand anatomically belonging to the side of the body at which the visual stimulus was presented; when the hands were crossed, a right hand stimulus, for example, facilitated a right-side visual stimulus, although the hand was located in left visual space (Bolognini and Maravita 2007). This result indicates that after disruption of parietal processing, body posture was no longer taken into account during the integration of vision and touch, in line with the findings about the role of parietal cortex in posture processing (see Section 28.2.5).

A more direct investigation of how the brain determines if a stimulus is located in the peripersonal space was undertaken in an fMRI study that independently manipulated visual and proprioceptive cues about hand posture to modulate the perceived distance of a small visual object from the participants’ hand. Vision of the arm could be occluded, and the occluded arm was then located near the visual object (i.e., peripersonally) or far from it; the distance from the object could be determined by the brain only by using proprioceptive information. Alternatively, vision could be available to show that the hand was either close or far from the stimulus. Ingeniously, the authors manipulated these proprioceptive and visual factors together by using a rubber arm: when the real arm was held far away from the visual object, the rubber hand could be placed near the object so that visually the object was in peripersonal space (Makin et al. 2007). fMRI activity due to these manipulations was found in posterior parietal areas. There was some evidence that for the determination of posture in relation to the visual object, proprioceptive signals were more prominent in the anterior IPS close to the somatosensory cortex, and that vision was more prominent in more posterior IPS areas, closer to visual areas. Importantly, however, all of these activations were located in the SPL and IPS, the areas that have repeatedly been shown to be relevant for the representation of posture and of the body schema.

Besides these neuroimaging approaches, behavioral studies have also been successful in investigating the peripersonal space and the body schema. One task that has yielded a multitude of findings is a cross-modal interference paradigm, the cross-modal congruency (CC) task (reviewed by Spence et al. 2004b). In this task, participants receive a tactile stimulus at one of four locations; two of these locations are “up” and two are “down” (see Figure 28.1). Participants are asked to judge the elevation of the tactile stimulus in each trial, regardless of its side (left or right). However, a to-be-ignored visual distractor stimulus is presented with every tactile target stimulus, also located at one of the four locations at which the tactile stimuli can occur. The visual distractor is independent of the tactile target; it can therefore occur at a congruent location (tactile and visual stimulus have the same elevation) or at an incongruent location (tactile and visual stimulus have opposing elevations). Despite the instruction to ignore the visual distractors, participants’ reaction times and error probabilities are influenced by them. When the visual distractors are congruent, participants perform faster and with higher accuracy than when the distractors are incongruent. The difference between the incongruent and the congruent conditions (e.g., in RT and in accuracy) is referred to as the CC effect. Importantly, the CC effect is larger when the distractors are located close to the stimulated hands rather than far away (Spence et al. 2004a). Moreover, the CC effect is larger when the distractors are placed near rubber hands, but only if those are positioned in front of the participant in such a way that, visually, they could belong to the participant’s body (Pavani et al. 2000). The CC effect is also modulated by tool use in a similar manner as by rubber hands; when a visual distractor is presented in far space, the CC effect is relatively small, but it increases when a tool is held near the distractor (Maravita et al. 2002; Maravita and Iriki 2004; Holmes et al. 2007). Finally, the CC effect is increased during the whole body illusion (induced by synchronous stroking; see Section 28.2.2) when the distractors are presented on the back of the video image felt to be one’s own body, compared to when participants see the same video image and distractor stimuli, but without the induction of the whole body illusion (Aspell et al. 2009). These findings indicate that cross-modal interaction, as indexed by the CC effect, is modulated by the distance of the distractors from what is currently represented as one’s own body (i.e., the body schema) and thus suggest that the CC effect arises in part from the processing of peripersonal space.

FIGURE 28.1 Standard cross-modal congruency task. Tactile stimuli are presented to two locations on the hand (often index finger and thumb holding a cube; here, back and palm of the hand). In each trial, one of the tactile stimuli is presented concurrently with a visual distractor.
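To make the CC measure itself concrete, the following sketch shows how a CC effect could be computed from raw trial data, as the incongruent-minus-congruent difference in RT and in error rate described above. The trial representation and field names are hypothetical; this is not the analysis code of any of the cited studies.

```python
# Minimal sketch of computing the cross-modal congruency (CC) effect:
# mean performance on incongruent trials minus mean performance on congruent
# trials, separately for reaction time and error rate. Trial fields are made up.

def cc_effect(trials):
    """trials: list of dicts with keys 'congruent' (bool), 'rt' (ms), 'correct' (bool)."""
    def mean(values):
        return sum(values) / len(values)

    congruent = [t for t in trials if t["congruent"]]
    incongruent = [t for t in trials if not t["congruent"]]

    rt_effect = mean([t["rt"] for t in incongruent]) - mean([t["rt"] for t in congruent])
    err_incongruent = 1 - mean([t["correct"] for t in incongruent])
    err_congruent = 1 - mean([t["correct"] for t in congruent])
    # Larger values = stronger interference from the visual distractor.
    return rt_effect, err_incongruent - err_congruent

# Toy data: incongruent distractors slow responses and raise the error rate.
trials = [
    {"congruent": True, "rt": 480, "correct": True},
    {"congruent": True, "rt": 500, "correct": True},
    {"congruent": False, "rt": 540, "correct": True},
    {"congruent": False, "rt": 545, "correct": False},
]
print(cc_effect(trials))  # -> (52.5, 0.5)
```

In the studies discussed above, it is this difference score that grows when the distractor falls within the peripersonal space of a (real, rubber, or tool-extended) hand.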

To summarize, monkey physiology, neuropsychological findings, and behavioral research suggest that the brain specially represents the space close around the body, the peripersonal space. There is a close relationship between the body schema and the representation of peripersonal space, as body posture must be taken into account to remap, from moment to moment, which part of external space is peripersonal.

28.3.3. Peripersonal Space around Different Parts of the Body

All of this behavioral research—like most neurophysiological, neuropsychological, and neuroimaging research—has explored peripersonal space and the body schema using stimulation at and near the hands. The hands may be considered special in that they are used for almost any kind of action we perform. Processing principles revealed for the hands may therefore not generalize to other body parts. As an example, hand posture, but not foot posture, has been reported to influence the mental rotation of these limbs (Ionta et al. 2007; Ionta and Blanke 2009; but see Parsons 1987). Moreover, monkey work has demonstrated multisensory neurons with peripersonal spatial characteristics only for the head, hand, and torso, but neurons with equivalent characteristics for the lower body have so far not been reported (Graziano et al. 2002). The peripersonal space representation may thus be limited to body parts that are important for the manipulation of objects under (mainly) visual control. To test this hypothesis in humans, body schema–related effects such as the CC effect, which so far have been established for the hands, must be investigated for other body parts.

The aforementioned study of the CC effect during the whole body illusion (Aspell et al. 2009; see also Section 28.2.2) demonstrated a peripersonal spatial effect near the back. The CC effect was observable also when stimuli were delivered to the feet (Schicke et al. 2009), suggesting that a representation of the peripersonal space exists also for the space around these limbs. If the hypothesis is correct that the body schema is created from body part–specific representations, one might expect that the representation of the peripersonal space of the hand and that of the foot do not interact. To test this prediction, tactile stimuli were presented to the hands while visual distractors were flashed either near the participant’s real foot, near a fake foot, or far from both the hand and the foot. The cross-modal interference of the visual distractors, indexed by the CC effect, was larger when they were presented in the peripersonal space of the real foot than when they were presented near the fake foot or in extrapersonal space (Schicke et al. 2009). The spatial judgment of tactile stimuli at the hand was thus modulated when a visual distractor appeared in the peripersonal space of another body part. This effect cannot be explained with the current concept of peripersonal space as tactile RFs encompassed by visual RFs. These results rather imply either a holistic body schema representation, or, more probably, interactions beyond simple RF overlap between the peripersonal space representations of different body parts (Holmes and Spence 2004; Spence et al. 2004b).

In sum, the peripersonal space is represented not just for the hands, but also for other body parts. Interactions between the peripersonal spatial representations of different body parts challenge the concept of peripersonal space being represented merely by overlapping RFs.

28.3.4. Across-Limb Effects in Spatial Remapping of Touch

The fact that visual distractors in the CC paradigm have a stronger influence when they are presented in the peripersonal space implies that the brain matches the location of the tactile stimulus with that of the visual one. The tactile stimulus is registered on the skin; to match this skin location to the location of the visual stimulus requires that body posture be taken into account and the skin location be projected into an external spatial reference frame. Alternatively, the visual location of the distractor could be computed with regard to the current location of the tactile stimulus, that is, with respect to the hand, and thus be viewed as a projection of external space onto somatotopic space (i.e., the skin).

This remapping of visual–tactile space has been more thoroughly explored by manipulating hand posture. As in the standard CC task described earlier, stimuli were presented to the two hands and the distractors were placed near the tactile stimuli (Spence et al. 2004a). However, in half of the trials, participants crossed their hands. If spatial remapping occurs in this task, then the CC effect should be high whenever the visual distractor is located near the stimulated hand. In contrast, if tactile stimuli were not remapped into external space, then a tactile stimulus on the right hand should always be influenced most by a right-hemifield visual stimulus, independent of body posture. The results were clear-cut: when the hands were crossed, the distractors that were now near the hand were most effective. In fact, in this experiment the CC effect pattern of left and right distractor stimuli completely reversed, which the authors interpreted as a “complete remapping of visuotactile space” (p. 162).

Spatial remapping could thus be viewed as a means of integrating spatial information from the different senses in multisensory contexts. However, spatial remapping has also been observed in purely tactile tasks that do not involve any distractor stimuli of a second modality. One example is the temporal order judgment (TOJ) task, in which participants judge which of two tactile stimuli occurred first. Performance in this task is impaired when participants cross their hands (Yamamoto and Kitazawa 2001a; Shore et al. 2002; Röder et al. 2004; Schicke and Röder 2006; Azanon and Soto-Faraco 2007). It is usually assumed that the performance deficit after hand crossing in the TOJ task is due to a conflict between two concurrently active reference frames: one anatomical and one external (Yamamoto and Kitazawa 2001a; Röder et al. 2004; Schicke and Röder 2006). The right–left coordinate axes of these two reference frames are opposed to each other when the hands are crossed; for example, the anatomically right arm is located in the externally left hemispace during hand crossing. This remapping takes place despite the task being purely tactile, and despite the detrimental effect of using the external reference frame in the task. Remapping of stimulus location by accounting for current body posture therefore seems to be an automatically evoked process in the tactile system.

In the typical TOJ task, the two stimuli are applied to the two hands. It is therefore possible that the crossing effect is simply due to confusion of the two homologous limbs, rather than to the spatial locations of the stimuli. Such confusion could arise from a coactivation of homologous brain areas in the two hemispheres (e.g., in SI or SII), which may make it difficult to assign the two concurrent tactile percepts to their respective locations. However, a TOJ crossing effect was found for tactile stimuli delivered to the two hands, to the two feet, or to one hand and the contralateral foot (Schicke and Röder 2006). In other words, participants were confused not only about which of the two hands or the two feet was stimulated first, but were equally impaired in deciding whether it was a hand or a foot that received the first stimulus. Therefore, the location of a touch on the body surface seems to be remapped into a more abstract spatial code in which the original skin location, and the somatotopic coding of primary sensory cortex, are no longer dominating features. In fact, it has been suggested that the location of a tactile stimulus on the body may be reconstructed by determining which body part currently occupies the part of space at which the tactile stimulus has been sensed (Kitazawa 2002). The externally anchored reference frame is activated in parallel with a somatotopic one, and their concurrent activation leads to the observed behavioral impairment.
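
The reconstruction idea attributed to Kitazawa (2002) above can be rendered as a simple lookup: given the externally sensed location of a touch and the current posture, the touched body part is the one that currently occupies that location. The sketch below is our own toy rendering of this idea; the body-part names, coordinates, and nearest-neighbor rule are illustrative assumptions only.

```python
import numpy as np

def touched_body_part(touch_external, limb_positions):
    """Attribute a touch, sensed at an external location, to the body part whose
    current external position lies closest to it (a nearest-neighbor toy rule)."""
    names = list(limb_positions)
    distances = [np.linalg.norm(np.asarray(limb_positions[name]) - np.asarray(touch_external))
                 for name in names]
    return names[int(np.argmin(distances))]

# With the hands crossed, the right hand occupies the left side of external space,
# so a touch sensed on the left is correctly attributed to the (anatomically) right hand.
crossed_posture = {"left hand": (0.20, 0.30), "right hand": (-0.20, 0.30),
                   "left foot": (-0.10, -0.40), "right foot": (0.10, -0.40)}
print(touched_body_part((-0.18, 0.28), crossed_posture))   # -> right hand
```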

To summarize, remapping of stimulus location in a multisensory experiment such as the CC paradigm is a necessity for aligning signals from different modalities. Yet, even when stimuli are purely unimodal, and the task would not require a recoding of tactile location into an external coordinate frame, such a transformation nonetheless seems to take place. Thus, even for purely tactile processing, posture information (e.g., proprioceptive and visual) is automatically integrated.

28.3.5. Is the External Reference Frame a Visual One?

The representation of several reference frames is, of course, not unique to the TOJ crossing effect. In monkeys, the parallel existence of multiple reference frames has been demonstrated in the different subareas of the IPS, for example, in VIP (Schlack et al. 2005), which is involved in the representation of peripersonal space, in MIP, which is involved in arm reaching (Batista et al. 1999), and in LIP, which is engaged in saccade planning (Stricanne et al. 1996). Somewhat counterintuitively, many neurons in these areas do not represent space in a reference frame that can be assigned to one of the sensory systems (e.g., a retinotopic one for vision, a head-centered one for audition) or a specific limb (e.g., a hand-centered reference frame for hand reach planning). Rather, there are numerous intermediate coding schemes present in the different neurons (Mullette-Gillman et al. 2005; Schlack et al. 2005). However, such intermediate coding has been shown to enable the transformation of spatial codes between different reference frames, possibly even in different directions, for example, from somatotopic to eye-centered and vice versa (Avillac et al. 2005; Pouget et al. 2002; Cohen and Andersen 2002; Xing and Andersen 2000). Similar intermediate coding has been found in posture-related area 5, which codes hand position in an intermediate manner between eye- and hand-centered coordinates (Buneo et al. 2002). Further downstream, in some parts of MIP, arm reaching coordinates may, in contrast, be represented fully in eye-centered coordinates, independent of whether the sensory target for reaching is visual (Batista et al. 1999; Scherberger et al. 2003; Pesaran et al. 2006) or auditory (Cohen and Andersen 2000). In addition to these results from monkeys, an fMRI experiment in humans has suggested common spatial processing of visual and tactile targets for saccade as well as for reach planning (Macaluso et al. 2007). Still further downstream, in the motor-related PMv, which has been proposed to form the peripersonal space circuit together with VIP, visual RFs are aligned with hand position (Graziano and Cooke 2006).
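
The computational appeal of such intermediate, gain-modulated codes is that a simple downstream readout can recover a location in whichever reference frame is needed. The sketch below is a deliberately simplified, one-dimensional illustration of this basis-function idea (in the spirit of Pouget et al. 2002); the unit counts, the linear form of the eye-position gain, and all numerical choices are our own assumptions rather than parameters from the cited work.

```python
import numpy as np

rng = np.random.default_rng(0)

# "Intermediate" units: Gaussian tuning to retinal location, gain-modulated by eye position.
pref_retinal = np.linspace(-40, 40, 15)       # preferred retinal locations (deg)
gain_slopes = np.linspace(-0.02, 0.02, 11)    # eye-position gain per group of units

def basis_responses(retinal_deg, eye_deg):
    """Population response to a stimulus at `retinal_deg` while the eyes are rotated
    by `eye_deg`; one unit per combination of retinal tuning and gain slope."""
    tuning = np.exp(-0.5 * ((retinal_deg - pref_retinal) / 10.0) ** 2)   # shape (15,)
    gains = 1.0 + gain_slopes * eye_deg                                   # shape (11,)
    return np.outer(gains, tuning).ravel()                                # shape (165,)

# Train a linear readout to report the head-centered location (= retinal + eye position).
retinal = rng.uniform(-40, 40, 500)
eye = rng.uniform(-20, 20, 500)
responses = np.array([basis_responses(r, e) for r, e in zip(retinal, eye)])
weights, *_ = np.linalg.lstsq(responses, retinal + eye, rcond=None)

# The same population, read out linearly, yields the transformed coordinate.
print(basis_responses(10.0, -15.0) @ weights)   # approximately -5 (head-centered, deg)
```

A different linear readout of the same population could return an eye-centered or a hand-centered coordinate instead, which is why such intermediate codes are often viewed as a hub for transformations in several directions.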

These findings have led to the suggestion that the external reference frame involved in tactile localization is a visual one, and that remapping occurs automatically to aid the fusion of spatial information of the different senses. Such use of visual coordinates may be helpful not only for action planning (e.g., the reach of the hand toward an object), but also for an efficient online correction of motor error with respect to the visual target (Buneo et al. 2002; Batista et al. 1999).

A number of variants of the TOJ paradigm have been employed to study the visual origin of the external reference frame in humans. For example, the crossing effect was alleviated when participants viewed uncrossed rubber hands (with their real hands hidden), indicating that visual (and not just proprioceptive) cues modulate spatial remapping (Azanon and Soto-Faraco 2007). In the same vein, congenitally blind people did not display a TOJ crossing effect, suggesting that they do not by default activate an external reference frame for tactile localization (Röder et al. 2004). Congenitally blind people also outperformed sighted participants when an anatomically anchored reference frame was advantageous for solving a task, whereas they performed worse than the sighted when an external reference frame was better suited to the task (Röder et al. 2007). Importantly, people who became blind later in life were influenced by an external reference frame in the same manner as sighted participants, indicating that spatial remapping develops during ontogeny when the visual system is available, and that the lack of automatic coordinate transformations into an external reference frame is not simply an unspecific effect of long-term visual deprivation (Röder et al. 2004, 2007). In conclusion, the use of an external reference frame seems to be induced by the visual system, and this suggests that the external coordinates used in the remapping of sensory information are visual coordinates.

Children did not show a TOJ crossing effect before the age of ~5½ years (Pagel et al. 2009). This late use of external coordinates suggests that spatial remapping requires a substantial amount of learning and visual–tactile experience during interaction with the environment. One might therefore expect remapping to take place only in regions of space that are accessible to vision. In the TOJ paradigm, one would thus expect a crossing effect when the hands are held in front, but no such crossing effect when the hands are held behind the back (as, because of the lack of visual–tactile experience in that part of space, no visual–tactile remapping should take place). At odds with these predictions, Kobor and colleagues (2006) observed a TOJ crossing effect (although somewhat reduced) also behind the back. We conducted the same experiment in our laboratory and found that the size of the crossing effect did not differ between the front and the back [previously unpublished data; n = 11 young, healthy, blindfolded adults; just noticeable difference (JND) for correct stimulus order: uncrossed front: 66 ± 10 ms; uncrossed back: 67 ± 11 ms; crossed front: 143 ± 39 ms; crossed back: 138 ± 25 ms; ANOVA main effect of part of space and interaction of hand crossing with part of space, both F(1,10) < 1]. Because we must assume only minimal visual–tactile experience for the space behind our body, the results of these two experiments do not support the idea that the external coordinate system in tactile remapping is purely visual.
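
For readers unfamiliar with the JND measure used above: it is typically obtained by fitting a psychometric function, for example a cumulative Gaussian, to the proportion of "right first" (or "left first") judgments as a function of stimulus onset asynchrony (SOA), and reading the JND off the spread of the fit. The sketch below uses invented response proportions purely to illustrate the procedure; these numbers are not the data reported above, and equating the JND with the fitted sigma is only one of several conventions.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

# Invented example data: proportion of "right hand first" responses per SOA (ms);
# positive SOA means the right-hand stimulus was delivered first.
soa = np.array([-200, -90, -55, -30, -15, 15, 30, 55, 90, 200], dtype=float)
p_right_first = np.array([0.03, 0.10, 0.22, 0.35, 0.45, 0.58, 0.68, 0.80, 0.92, 0.97])

def psychometric(x, pse, sigma):
    """Cumulative Gaussian: pse = point of subjective simultaneity, sigma = spread."""
    return norm.cdf(x, loc=pse, scale=sigma)

(pse, sigma), _ = curve_fit(psychometric, soa, p_right_first, p0=(0.0, 50.0))
print(f"PSE = {pse:.1f} ms, JND (one convention: sigma of the fit) = {sigma:.1f} ms")
```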

It is possible that, rather than simply visual (i.e., eye- or retina-centered) coordinates, the brain uses an action-based reference frame that represents the environment (or action target locations) in external coordinates and that can be used to orient not only the eyes, but also gaze, the trunk, or the whole body. In other words, the external reference frame may be anchored to the eyes for that part of space that is currently accessible to the eyes, but may be related to head, trunk, or body movement parameters for those parts of space currently out of view. Such a coordinate system would benefit from using external coordinates, because eye-, head-, and possibly trunk or body position must all be fused to allow directing the eyes (and, with them, usually the focus of attention) onto an externally located target.
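
A toy numerical example of why such a frame has to fuse several postural signals (our own illustration; the one-dimensional angles and the assumption of simple additivity are chosen only for demonstration): to direct gaze at a target specified in body-centered coordinates, the current eye-in-head and head-on-trunk angles must both be taken into account.

```python
def gaze_shift_required(target_body_deg, eye_in_head_deg, head_on_trunk_deg):
    """Angular distance (body-centered azimuth, deg) between the current gaze
    direction and a target, combining eye-in-head and head-on-trunk signals."""
    current_gaze_body = eye_in_head_deg + head_on_trunk_deg
    return target_body_deg - current_gaze_body

# A target 120 deg to the right lies behind the shoulder: the required 90 deg shift
# exceeds what the eyes alone can do, so head and trunk must contribute to orienting.
print(gaze_shift_required(target_body_deg=120, eye_in_head_deg=10, head_on_trunk_deg=20))
```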

Such an action-related reference frame seems plausible for several reasons. At least eye and head orienting (together referred to as gaze orienting) are both mediated by the superior colliculus (SC) (Walton et al. 2007), a brain structure that is important for multisensory processing (Stein et al. 1995; Freedman and Sparks 1997; Stuphorn et al. 2000) and that is connected to the IPS (Pare and Wurtz 1997, 2001). IPS as well as PMv (also connected to IPS) encode reaching (i.e., action) targets also in the dark, that is, in the absence of visual information (Fattori et al. 2005; Graziano et al. 1997). Moreover, a recent fMRI study demonstrated activation of the frontal eye fields—a structure thought to be involved in saccadic eye movements and visual attention—to sounds behind the head (Tark and Curtis 2009), which would suggest either a representation of unseen space (Tark and Curtis 2009) or, alternatively, the representation of a target coordinate in “action space” rather than in eye-centered space.

For locations that one can orient toward with an eye–head movement, an action-based reference frame could be identical to a visual reference frame and use eye- or gaze-centered coordinates, in line with the eye-centered coding of saccade as well as hand reach targets in LIP and MIP. However, the monkey’s head is usually fixed during single-cell recording experiments, making it impossible to differentiate between eye-centered and gaze-centered (let alone trunk- or body-centered) target coding. In addition, the spatial coding of reach targets that are out of view (but not in the dark) has, to our knowledge, not been investigated.

To sum up, many electrophysiological and behavioral studies have suggested that touch is remapped into visual coordinates, presumably to permit its integration with information from other modalities. Remapping refers to a recoding of the location of a tactile event on the skin into external-spatial coordinates; in other words, remapping accounts for body posture when matching visual and tactile spatial locations. Owing to the influence of the visual system during ontogeny, remapping occurs automatically even for unimodal tactile events; accordingly, it is absent in the congenitally blind. Yet, the external reference frame may be “more than” visual, subserving orienting actions also toward parts of space outside the current visual field.

28.3.6. Investigating the Body Schema and Reference Frames with Electrophysiology

Most of the evidence for coordinate transformations in humans discussed in this chapter so far has used behavioral measures. Electrophysiological measures (e.g., ERPs) offer an additional approach to investigate these processes. ERPs record electrical brain signals with millisecond resolution and therefore allow a very detailed investigation of functional brain activity. One fruitful approach is the manipulation of the attentional processing of sensory stimuli: it is known that the ERP is enhanced when a stimulus is presented at a location the person is currently attending to. In fact, there have been reports about the effect of hand crossing on the attentional processing of tactile stimuli delivered to the two hands. When a tactile stimulus is delivered to a hand while participants direct their attention to that hand, ERP deflections in the time range of 80–150 ms as well as between 200 and 300 ms are enhanced compared to when the same stimuli are delivered to the same hand while it is not attended. However, when participants crossed their hands, early attentional effects disappeared, and later effects were significantly reduced (Eimer et al. 2001, 2003).

These ERP results imply that tactile spatial attention does not rely on an anatomical reference frame alone, as posture should otherwise have had no influence on attention-related ERP effects. A disadvantage of this experimental design is that it differentiates only coarsely between attended and unattended stimulation when determining the influence of reference frames: the lack of a difference between attended and unattended conditions after hand crossing may be due to mere confusion of the two hands, effectively preventing attention from being directed selectively to one hand. Alternatively, it may be due to attention being allocated to one hand in a somatotopic reference frame and to the other hand in an external reference frame.

However, the difference in ERP magnitude between attended and unattended spatial locations is not all-or-none. Rather, ERP attention effects decrease gradually with the distance of a stimulus from the attended location, a phenomenon termed the spatial attentional gradient (Mangun and Hillyard 1988). This gradient can be exploited to test more thoroughly whether the ERP effects of hand crossing are attributable to hand confusion, and to investigate whether coordinate transformations are computed for body parts other than the hands. To this end, participants were asked to attend to one of their feet while tactile stimuli were presented to both hands and feet in random order. The hands were placed near the feet. Crucially, in some blocks each hand lay near its ipsilateral foot, whereas in others the hands were crossed so that each hand lay next to its contralateral foot. Thus, each hand could be near to or far from the attended location (one of the feet). The external spatial distance of each hand to the attended foot reversed with hand crossing, whereas the anatomical distance from each hand to the attended foot of course remained identical in the uncrossed and crossed conditions. Examining the spatial gradient in ERPs to hand and foot stimuli thus made it possible to determine whether the tactile system defines spatial distance in somatotopic or in external coordinates.

In the time interval 100–140 ms after stimulus presentation, ERPs of unattended tactile hand stimuli were more similar to the ERP of an attended hand stimulus when the hands were located close to the attended foot than when they were located far away, demonstrating that tactile attention uses an external reference frame (Heed and Röder 2010; see Figure 28.2). At the same time, ERPs were also influenced by the anatomical distance between the attended and the stimulated locations. ERPs to unattended hand stimuli were more similar to the ERP of an attended hand stimulus when the ipsilateral rather than the contralateral foot was attended.

FIGURE 28.2 ERP results for hand stimulation. Traces from a fronto-central electrode ipsilateral to stimulation. In the figures depicting the different conditions, the attended foot is indicated by a filled gray dot; the stimulated right hand is indicated by a gray …

ERPs in this time range are thought to originate in the secondary somatosensory cortex (SII) (Frot and Mauguiere 1999; Eimer and Forster 2003). Recall that SII was implicated in the integration of a rubber hand, as indexed by the perceptual drift of one’s own hand toward the rubber hand (Tsakiris et al. 2007), as well as in making postural judgments (Corradi-Dell’Acqua et al. 2009). These findings thus converge with the ERP results in emphasizing the importance of relatively low-level somatosensory areas for the representation of the body schema: they code not only the current position of our hands, but also the current spatial relationship of different body parts to each other, in both anatomical and external coordinates.

28.3.7. Summary

The second part of this chapter focused on the influence of the body and the body schema on multisensory processing. We started by showing that body posture can be used to calibrate the spatial relationship between the senses, and we discussed that the brain weights information from the different senses according to their reliability. Such statistically optimal integration processes may also be at the heart of the phenomena presented in the first part of the chapter, for example, the rubber hand illusion. The remainder of the chapter focused on multisensory spatial processing, starting out with the evidence for a special representation of the space directly around our body, which demonstrates the link between the body schema and multisensory spatial processing. We showed that the peripersonal space is represented not only for the hands, but also for other body parts, and that not all experimental results can be explained by the common notion of the peripersonal space being represented simply by tactile RFs on a body part paired with matching visual RFs. We then showed that the body schema matters not only in multisensory processing, but also in purely tactile processing, in that tactile locations are automatically remapped into external spatial coordinates. These external coordinates are closely related to the visual modality, but extend beyond the current visual field into space that cannot be seen. Finally, ERPs were shown to be modulated by both anatomical and external coordinate frames. This highlights that although in some situations tactile locations seem to be fully remapped into purely external coordinates, the original, anatomical location of the touch is never quite forgotten. Such concurrent representations of anatomical and external location seem useful in the context of action control. For example, to fend off a dangerous object, be it an insect ready to sting or the hand of an adversary who has grabbed one’s arm, it is crucial to know not only which limb can be used to initiate the defensive action, but also where in space that action must be directed. Thus, when the right arm has been grabbed, one cannot use this arm to strike at the opponent, regardless of the captured arm’s current external location. However, once it has been determined which arm is free for use in a counterattack, it becomes crucial to know where in space this arm should strike to fend off the attacker.

28.4. CONCLUSION

Our different senses enable us to perceive and act upon the environment. However, they also enable us to perceive ourselves and, first and foremost, our body. Because we can move in many different ways, our brain must keep track of our current posture at all times to guide actions effectively. At the same time, the brain is surprisingly flexible with respect to what it assumes to belong to the body at any given point in time, and to the body’s current shape. One of the main principles of the brain’s body processing seems to be the attempt to “make sense of all the senses” by integrating all available information. As we saw, this processing principle can lead to surprising illusions, such as the rubber hand illusion, the Pinocchio nose, or the feeling of being located outside the body, displaced toward a video image. As is often the case in psychology, these illusions also inform us about how the brain operates under normal circumstances.

As much as multisensory information is important for the construction of our body schema, this body representation is in turn important for many instances of multisensory processing. Visual events in the peripersonal space receive special processing to protect our body, and our flexibility to move in many ways requires that spatial information from the different sensory modalities be transformed into a common reference system. None of these functions could work without some representation of the body’s current configuration.

REFERENCES

  1. Aglioti S, Smania N, Manfredi M, Berlucchi G. Disownership of left hand and objects related to it in a patient with right brain damage. Neuroreport. 1996;8:293–296. [PubMed: 9051798]
  2. Alais D, Burr D. The ventriloquist effect results from near-optimal bimodal integration. Curr Biol. 2004;14:257–262. [PubMed: 14761661]
  3. Andersen R.A, Cui H. Intention, action planning, and decision making in parietal–frontal circuits. Neuron. 2009;63:568–583. [PubMed: 19755101]
  4. Armel K.C, Ramachandran V. S. Projecting sensations to external objects: Evidence from skin conductance response. Proc R Soc Lond B Biol Sci. 2003;270:1499–1506. [PMC free article: PMC1691405] [PubMed: 12965016]
  5. Aspell J.E, Lenggenhager B, Blanke O. Keeping in touch with one's self: Multisensory mechanisms of self-consciousness. PLoS ONE. 2009;4:e6488. [PMC free article: PMC2715165] [PubMed: 19654862]
  6. Avillac M, Deneve S, Olivier E, Pouget A, Duhamel J. R. Reference frames for representing visual and tactile locations in parietal cortex. Nat Neurosci. 2005;8:941–949. [PubMed: 15951810]
  7. Azanon E, Soto-Faraco S. Alleviating the 'crossed-hands' deficit by seeing uncrossed rubber hands. Exp Brain Res. 2007;182:537–548. [PubMed: 17643239]
  8. Bakheit A.M, Roundhill S. Supernumerary phantom limb after stroke. Postgrad Med J. 2005;81:e2. [PMC free article: PMC1743235] [PubMed: 15749787]
  9. Batista A.P, Buneo C. A, Snyder L. H, Andersen R. A. Reach plans in eye-centered coordinates. Science. 1999;285:257–260. [PubMed: 10398603]
  10. Berlucchi G, Aglioti S. M. The body in the brain revisited. Exp Brain Res. 2010;200:25–35. [PubMed: 19690846]
  11. Bestmann S, Oliviero A, Voss M, Dechent P, Lopez-Dolado E, Driver J, Baudewig J. Cortical correlates of TMS-induced phantom hand movements revealed with concurrent TMS-fMRI. Neuropsychologia. 2006;44:2959–2971. [PubMed: 16889805]
  12. Blanke O, Landis T, Spinelli L, Seeck M. Out-of-body experience and autoscopy of neurological origin. Brain. 2004;127:243–258. [PubMed: 14662516]
  13. Blanke O, Metzinger T. Full-body illusions and minimal phenomenal selfhood. Trends Cogn Sci. 2009;13:7–13. [PubMed: 19058991]
  14. Blanke O, Ortigue S, Landis T, Seeck M. Stimulating illusory own-body perceptions. Nature. 2002;419:269–270. [PubMed: 12239558]
  15. Bolognini N, Maravita A. Proprioceptive alignment of visual and somatosensory maps in the posterior parietal cortex. Curr Biol. 2007;17:1890–1895. [PubMed: 17964160]
  16. Botvinick M. Neuroscience. Probing the neural basis of body ownership. Science. 2004;305:782–783. [PubMed: 15297651]
  17. Botvinick M, Cohen J. Rubber hands ‘feel’ touch that eyes see. Nature. 1998;391:756. [PubMed: 9486643]
  18. Breveglieri R, Galletti C, Monaco S, Fattori P. Visual, somatosensory, and bimodal activities in the macaque parietal area PEc. Cereb Cortex. 2008;18:806–816. [PubMed: 17660487]
  19. Brugger P, Kollias S. S, Muri R. M, Crelier G, Hepp-Reymond M. C, Regard M. Beyond remembering: Phantom sensations of congenitally absent limbs. Proc Natl Acad Sci U S A. 2000;97:6167–6172. [PMC free article: PMC18576] [PubMed: 10801982]
  20. Bruno N, Bertamini M. Haptic perception after a change in hand size. Neuropsychologia. 2010;48:1853–1856. [PubMed: 20122945]
  21. Buneo C.A, Jarvis M. R, Batista A. P, Andersen R. A. Direct visuomotor transformations for reaching. Nature. 2002;416:632–636. [PubMed: 11948351]
  22. Cardinali L, Brozzoli C, Farne A. Peripersonal space and body schema: Two labels for the same concept? Brain Topogr. 2009a;21:252–260. [PubMed: 19387818]
  23. Cardinali L, Frassinetti F, Brozzoli C, Urquizar C, Roy A. C, Farne A. Tool-use induces morphological updating of the body schema. Curr Biol. 2009b;19:R478–R479. [PubMed: 19549491]
  24. Cohen Y.E, Andersen R. A. Reaches to sounds encoded in an eye-centered reference frame. Neuron. 2000;27:647–652. [PubMed: 11055445]
  25. Cohen Y.E, Andersen R. A. A common reference frame for movement plans in the posterior parietal cortex. Nat Rev Neurosci. 2002;3:553–562. [PubMed: 12094211]
  26. Collins T, Schicke T, Röder B. Action goal selection and motor planning can be dissociated by tool use. Cognition. 2008;109:363–371. [PubMed: 19012884]
  27. Corradi-Dell'Acqua C, Tomasino B, Fink G. R. What is the position of an arm relative to the body? Neural correlates of body schema and body structural description. J Neurosci. 2009;29:4162–4171. [PubMed: 19339611]
  28. Cutting J. Study of anosognosia. J Neurol Neurosurg Psychiatry. 1978;41:548–555. [PMC free article: PMC493083] [PubMed: 671066]
  29. Daprati E, Sirigu A, Pradat-Diehl P, Franck N, Jeannerod M. Recognition of self-produced movement in a case of severe neglect. Neurocase. 2000;6:477–486.
  30. de Lange F. P, Helmich R. C, Toni I. Posture influences motor imagery: An fMRI study. Neuroimage. 2006;33:609–617. [PubMed: 16959501]
  31. de Vignemont F. Body schema and body image—pros and cons. Neuropsychologia. 2010;48:669–680. [PubMed: 19786038]
  32. di Pellegrino G, Frassinetti F. Direct evidence from parietal extinction of enhancement of visual attention near a visible hand. Curr Biol. 2000;10:1475–1477. [PubMed: 11102814]
  33. Dijkerman H.C, de Haan E. H. Somatosensory processes subserving perception and action. Behav Brain Sci. 2007;30:189–201; discussion 201–239. [PubMed: 17705910]
  34. Duhamel J.R, Colby C. L, Goldberg M. E. Ventral intraparietal area of the macaque: Congruent visual and somatic response properties. J Neurophysiol. 1998;79:126–136. [PubMed: 9425183]
  35. Ehrsson H.H. The experimental induction of out-of-body experiences. Science. 2007;317:1048. [PubMed: 17717177]
  36. Ehrsson H.H, Spence C, Passingham R. E. That's my hand! Activity in premotor cortex reflects feeling of ownership of a limb. Science. 2004;305:875–877. [PubMed: 15232072]
  37. Ehrsson H.H, Wiech K, Weiskopf N, Dolan R. J, Passingham R. E. Threatening a rubber hand that you feel is yours elicits a cortical anxiety response. Proc Natl Acad Sci U S A. 2007;104:9828–9833. [PMC free article: PMC1887585] [PubMed: 17517605]
  38. Eimer M, Cockburn D, Smedley B, Driver J. Cross-modal links in endogenous spatial attention are mediated by common external locations: Evidence from event-related brain potentials. Exp Brain Res. 2001;139:398–411. [PubMed: 11534863]
  39. Eimer M, Forster B. Modulations of early somatosensory ERP components by transient and sustained spatial attention. Exp Brain Res. 2003;151:24–31. [PubMed: 12756516]
  40. Eimer M, Forster B, Van Velzen J. Anterior and posterior attentional control systems use different spatial reference frames: ERP evidence from covert tactile-spatial orienting. Psychophysiology. 2003;40:924–933. [PubMed: 14986845]
  41. Ernst M.O, Banks M. S. Humans integrate visual and haptic information in a statistically optimal fashion. Nature. 2002;415:429–433. [PubMed: 11807554]
  42. Fattori P, Kutz D. F, Breveglieri R, Marzocchi N, Galletti C. Spatial tuning of reaching activity in the medial parieto-occipital cortex (area V6A) of macaque monkey. Eur J Neurosci. 2005;22:956–972. [PubMed: 16115219]
  43. Fitzgerald P.J, Lane J. W, Thakur P. H, Hsiao S. S. Receptive field properties of the macaque second somatosensory cortex: Evidence for multiple functional representations. J Neurosci. 2004;24:11193–11204. [PMC free article: PMC1800879] [PubMed: 15590936]
  44. Fogassi L, Gallese V, Fadiga L, Luppino G, Matelli M, Rizzolatti G. Coding of peripersonal space in inferior premotor cortex (area F4) J Neurophysiol. 1996;76:141–157. [PubMed: 8836215]
  45. Freedman E.G, Sparks D. L. Activity of cells in the deeper layers of the superior colliculus of the rhesus monkey: Evidence for a gaze displacement command. J Neurophysiol. 1997;78:1669–1690. [PubMed: 9310452]
  46. Frot M, Mauguiere F. Timing and spatial distribution of somatosensory responses recorded in the upper bank of the sylvian fissure (SII area) in humans. Cereb Cortex. 1999;9:854–863. [PubMed: 10601004]
  47. Gallagher S. Body image and body schema: A conceptual clarification. J Mind Behav. 1986;7:541–554.
  48. Graziano M.S. Where is my arm? The relative role of vision and proprioception in the neuronal representation of limb position. Proc Natl Acad Sci U S A. 1999;96:10418–10421. [PMC free article: PMC17903] [PubMed: 10468623]
  49. Graziano M.S, Cooke D. F. Parieto-frontal interactions, personal space, and defensive behavior. Neuropsychologia. 2006;44:845–859. [PubMed: 16277998]
  50. Graziano M.S, Cooke D. F, Taylor C. S. Coding the location of the arm by sight. Science. 2000;290:1782–1786. [PubMed: 11099420]
  51. Graziano M.S, Gandhi S. Location of the polysensory zone in the precentral gyrus of anesthetized monkeys. Exp Brain Res. 2000;135:259–266. [PubMed: 11131511]
  52. Graziano M.S, Hu X. T, Gross C. G. Visuospatial properties of ventral premotor cortex. J Neurophysiol. 1997;77:2268–2292. [PubMed: 9163357]
  53. Graziano M.S, Reiss L. A, Gross C. G. A neuronal representation of the location of nearby sounds. Nature. 1999;397:428–430. [PubMed: 9989407]
  54. Graziano M.S, Taylor C. S, Moore T. Complex movements evoked by microstimulation of precentral cortex. Neuron. 2002;34:841–851. [PubMed: 12062029]
  55. Graziano M.S, Yap G. S, Gross C. G. Coding of visual space by premotor neurons. Science. 1994;266:1054–1057. [PubMed: 7973661]
  56. Grefkes C, Fink G. R. The functional organization of the intraparietal sulcus in humans and monkeys. J Anat. 2005;207:3–17. [PMC free article: PMC1571496] [PubMed: 16011542]
  57. Halligan P.W, Marshall J. C, Wade D. T. Three arms: A case study of supernumerary phantom limb after right hemisphere stroke. J Neurol Neurosurg Psychiatry. 1993;56:159–166. [PMC free article: PMC1014815] [PubMed: 8437005]
  58. Halligan P.W, Marshall J. C, Wade D. T. Unilateral somatoparaphrenia after right hemisphere stroke: A case description. Cortex. 1995;31:173–182. [PubMed: 7781314]
  59. Harris C.M, Wolpert D. M. Signal-dependent noise determines motor planning. Nature. 1998;394:780–784. [PubMed: 9723616]
  60. Heed T, Röder B. Common anatomical and external coding for hands and feet in tactile attention: Evidence from event-related potentials. J Cogn Neurosci. 2010;22:184–202. [PubMed: 19199399]
  61. Holmes N.P, Calvert G. A, Spence C. Extending or projecting peripersonal space with tools? Multisensory interactions highlight only the distal and proximal ends of tools. Neurosci Lett. 2004;372:62–67. [PubMed: 15531089]
  62. Holmes N.P, Calvert G. A, Spence C. Tool use changes multisensory interactions in seconds: Evidence from the crossmodal congruency task. Exp Brain Res. 2007;183:465–476. [PMC free article: PMC2084481] [PubMed: 17665178]
  63. Holmes N.P, Snijders H. J, Spence C. Reaching with alien limbs: Visual exposure to prosthetic hands in a mirror biases proprioception without accompanying illusions of ownership. Percept Psychophys. 2006;68:685–701. [PMC free article: PMC1564193] [PubMed: 16933431]
  64. Holmes N.P, Spence C. The body schema and multisensory representation(s) of peripersonal space. Cogn Process. 2004;5:94–105. [PMC free article: PMC1350799] [PubMed: 16467906]
  65. Holmes N.P, Spence C, Hansen P. C, Mackay C. E, Calvert G. A. The multisensory attentional consequences of tool use: A functional magnetic resonance imaging study. PLoS ONE. 2008;3:e3502. [PMC free article: PMC2567039] [PubMed: 18958150]
  66. Hötting K, Röder B. Hearing cheats touch, but less in congenitally blind than in sighted individuals. Psychol Sci. 2004;15:60–64. [PubMed: 14717833]
  67. Ionta S, Blanke O. Differential influence of hands posture on mental rotation of hands and feet in left and right handers. Exp Brain Res. 2009;195:207–217. [PubMed: 19326106]
  68. Ionta S, Fourkas A. D, Fiorio M, Aglioti S. M. The influence of hands posture on mental rotation of hands and feet. Exp Brain Res. 2007;183:1–7. [PubMed: 17643238]
  69. Iriki A, Tanaka M, Iwamura Y. Coding of modified body schema during tool use by macaque postcentral neurones. Neuroreport. 1996;7:2325–2330. [PubMed: 8951846]
  70. Kammers M.P, de Vignemont F, Verhagen L, Dijkerman H. C. The rubber hand illusion in action. Neuropsychologia. 2009a;47:204–211. [PubMed: 18762203]
  71. Kammers M.P, Kootker J. A, Hogendoorn H, Dijkerman H. C. How many motoric body representations can we grasp? Exp Brain Res. 2009b;202:203–212. [PMC free article: PMC2845887] [PubMed: 20039029]
  72. Kammers M.P, Verhagen L. O, Dijkerman L. H. C, Hogendoorn H, De Vignemont F, Schutter D. J. Is this hand for real? Attenuation of the rubber hand illusion by transcranial magnetic stimulation over the inferior parietal lobule. J Cogn Neurosci. 2009c;21:1311–1320. [PubMed: 18752397]
  73. Karnath H.O, Ferber S, Himmelbach M. Spatial awareness is a function of the temporal not the posterior parietal lobe. Nature. 2001;411:950–953. [PubMed: 11418859]
  74. Kase C.S, Troncoso J. F, Court J. E, Tapia J. F, Mohr J. P. Global spatial disorientation. Clinicopathologic correlations. J Neurol Sci. 1977;34:267–278. [PubMed: 925713]
  75. Kitazawa S. Where conscious sensation takes place. Conscious Cogn. 2002;11:475–477. [PubMed: 12435379]
  76. Kobor I, Furedi L, Kovacs G, Spence C, Vidnyanszky Z. Back-to-front: Improved tactile discrimination performance in the space you cannot see. Neurosci Lett. 2006;400:163–167. [PubMed: 16516383]
  77. Körding K.P, Wolpert D. M. Bayesian integration in sensorimotor learning. Nature. 2004;427:244–247. [PubMed: 14724638]
  78. Lackner J.R. Some proprioceptive influences on the perceptual representation of body shape and orientation. Brain. 1988;111(Pt 2):281–297. [PubMed: 3378137]
  79. Lackner J.R, DiZio P. A. Aspects of body self-calibration. Trends Cogn Sci. 2000;4:279–288. [PubMed: 10859572]
  80. Lackner J.R, Shenker B. Proprioceptive influences on auditory and visual spatial localization. J Neurosci. 1985;5:579–583. [PubMed: 3973685]
  81. Lacroix R, Melzack R, Smith D, Mitchell N. Multiple phantom limbs in a child. Cortex. 1992;28:503–507. [PubMed: 1395650]
  82. Ladavas E. Functional and dynamic properties of visual peripersonal space. Trends Cogn Sci. 2002;6:17–22. [PubMed: 11849611]
  83. Ladavas E, di Pellegrino G, Farne A, Zeloni G. Neuropsychological evidence of an integrated visuotactile representation of peripersonal space in humans. J Cogn Neurosci. 1998;10:581–589. [PubMed: 9802991]
  84. Lenggenhager B, Tadi T, Metzinger T, Blanke O. Video ergo sum: manipulating bodily self-consciousness. Science. 2007;317:1096–1099. [PubMed: 17717189]
  85. Lloyd D.M, Shore D. I, Spence C, Calvert G. A. Multisensory representation of limb position in human premotor cortex. Nat Neurosci. 2003;6:17–18. [PubMed: 12483217]
  86. Longo M.R, Schuur F, Kammers M. P, Tsakiris M, Haggard P. What is embodiment? A psychometric approach. Cognition. 2008;107:978–998. [PubMed: 18262508]
  87. Lotze M, Flor H, Grodd W, Larbig W, Birbaumer N. Phantom movements and pain. An fMRI study in upper limb amputees. Brain. 2001;124:2268–2277. [PubMed: 11673327]
  88. Macaluso E, Frith C. D, Driver J. Crossmodal spatial influences of touch on extrastriate visual areas take current gaze direction into account. Neuron. 2002;34:647–658. [PubMed: 12062047]
  89. Macaluso E, Frith C. D, Driver J. Delay activity and sensory-motor translation during planned eye or hand movements to visual or tactile targets. J Neurophysiol. 2007;98:3081–3094. [PubMed: 17898151]
  90. Makin T.R, Holmes N. P, Zohary E. Is that near my hand? Multisensory representation of peripersonal space in human intraparietal sulcus. J Neurosci. 2007;27:731–740. [PubMed: 17251412]
  91. Mangun G.R, Hillyard S. A. Spatial gradients of visual attention: Behavioral and electrophysiological evidence. Electroencephalogr Clin Neurophysiol. 1988;70:417–428. [PubMed: 2460315]
  92. Maravita A, Iriki A. Tools for the body (schema). Trends Cogn Sci. 2004;8:79–86. [PubMed: 15588812]
  93. Maravita A, Spence C, Kennett S, Driver J. Tool-use changes multimodal spatial interactions between vision and touch in normal humans. Cognition. 2002;83:B34. [PubMed: 11869727]
  94. Marino B.F, Stucchi N, Nava E, Haggard P, Maravita A. Distorting the visual size of the hand affects hand pre-shaping during grasping. Exp Brain Res. 2010;202:499–505. [PubMed: 20044746]
  95. McGonigle D. J, Hanninen R, Salenius S, Hari R, Frackowiak R. S, Frith C. D. Whose arm is it anyway? An fMRI case study of supernumerary phantom limb. Brain. 2002;125:1265–1274. [PubMed: 12023315]
  96. Metzinger T. Why are out-of-body experiences interesting for philosophers? The theoretical relevance of OBE research. Cortex. 2009;45:256–258. [PubMed: 19046743]
  97. Misaki M, Matsumoto E, Miyauchi S. Dorsal visual cortex activity elicited by posture change in a visuo- tactile matching task. Neuroreport. 2002;13:1797–1800. [PubMed: 12395126]
  98. Mort D.J, Malhotra P, Mannan S. K, Rorden C, Pambakian A, Kennard C, Husain M. The anatomy of visual neglect. Brain. 2003;126:1986–1997. [PubMed: 12821519]
  99. Mullette-Gillman O.A, Cohen Y. E, Groh J. M. Eye-centered, head-centered, and complex coding of visual and auditory targets in the intraparietal sulcus. J Neurophysiol. 2005;94:2331–2352. [PubMed: 15843485]
  100. Naito E. Sensing limb movements in the motor cortex: How humans sense limb movement. Neuroscientist. 2004;10:73–82. [PubMed: 14987450]
  101. Naito E, Roland P. E, Ehrsson H. H. I feel my hand moving: A new role of the primary motor cortex in somatic perception of limb movement. Neuron. 2002;36:979–988. [PubMed: 12467600]
  102. Obayashi S, Tanaka M, Iriki A. Subjective image of invisible hand coded by monkey intraparietal neurons. Neuroreport. 2000;11:3499–3505. [PubMed: 11095507]
  103. Pagel B, Heed T, Röder B. Change of reference frame for tactile localization during child development. Dev Sci. 2009;12:929–937. [PubMed: 19840048]
  104. Paqueron X, Leguen M, Rosenthal D, Coriat P, Willer P. J. C, Danziger N. The phenomenology of body image distortions induced by regional anaesthesia. Brain. 2003;126:702–712. [PubMed: 12566290]
  105. Pare M, Wurtz R. H. Monkey posterior parietal cortex neurons antidromically activated from superior colliculus. J Neurophysiol. 1997;78:3493–3497. [PubMed: 9405568]
  106. Pare M, Wurtz R. H. Progression in neuronal processing for saccadic eye movements from parietal cortex area LIP to superior colliculus. J Neurophysiol. 2001;85:2545–2562. [PubMed: 11387400]
  107. Parsons L.M. Imagined spatial transformations of one's hands and feet. Cogn Psychol. 1987;19:178–241. [PubMed: 3581757]
  108. Pavani F, Spence C, Driver J. Visual capture of touch: Out-of-the-body experiences with rubber gloves. Psychol Sci. 2000;11:353–359. [PubMed: 11228904]
  109. Pellijeff A, Bonilha L, Morgan P. S, McKenzie K, Jackson S. R. Parietal updating of limb posture: An event-related fMRI study. Neuropsychologia. 2006;44:2685–2690. [PubMed: 16504223]
  110. Pesaran B, Nelson M. J, Andersen R. A. Dorsal premotor neurons encode the relative position of the hand, eye, and goal during reach planning. Neuron. 2006;51:125–134. [PMC free article: PMC3066049] [PubMed: 16815337]
  111. Pouget A, Deneve S, Duhamel J. R. A computational perspective on the neural basis of multisensory spatial representations. Nat Rev Neurosci. 2002;3:741–747. [PubMed: 12209122]
  112. Press C, Heyes C, Haggard P, Eimer M. Visuotactile learning and body representation: An ERP study with rubber hands and rubber objects. J Cogn Neurosci. 2008;20:312–323. [PMC free article: PMC2373573] [PubMed: 18275337]
  113. Previc F.H. The neuropsychology of 3-D space. Psychol Bull. 1998;124:123–164. [PubMed: 9747184]
  114. Ramachandran V.S. Behavioral and magnetoencephalographic correlates of plasticity in the adult human brain. Proc Natl Acad Sci U S A. 1993;90:10413–10420. [PMC free article: PMC47787] [PubMed: 8248123]
  115. Ramachandran V.S, Hirstein W. Three laws of qualia—what neurology tells us about the biological functions of consciousness, qualia and the self. J Consciousness Stud. 1997;4:429–458.
  116. Ramachandran V.S, Hirstein W. The perception of phantom limbs. The D. O. Hebb lecture. Brain. 1998;121(Pt 9):1603–1630. [PubMed: 9762952]
  117. Rizzolatti G, Luppino G, Matelli M. The organization of the cortical motor system: New concepts. Electroencephalogr Clin Neurophysiol. 1998;106:283–296. [PubMed: 9741757]
  118. Rizzolatti G, Scandolara C, Matelli M, Gentilucci M. Afferent properties of periarcuate neurons in macaque monkeys. I. Somatosensory responses. Behav Brain Res. 1981a;2:125–146. [PubMed: 7248054]
  119. Rizzolatti G, Scandolara C, Matelli M, Gentilucci M. Afferent properties of periarcuate neurons in macaque monkeys: II. Visual responses. Behav Brain Res. 1981b;2:147–163. [PubMed: 7248055]
  120. Röder B, Kusmierek A, Spence C, Schicke T. Developmental vision determines the reference frame for the multisensory control of action. Proc Natl Acad Sci U S A. 2007;104:4753–4758. [PMC free article: PMC1838672] [PubMed: 17360596]
  121. Röder B, Rösler F, Spence C. Early vision impairs tactile perception in the blind. Curr Biol. 2004;14:121–124. [PubMed: 14738733]
  122. Roux F.E, Lotterie J. A, Cassol E, Lazorthes Y, Sol J. C, Berry I. Cortical areas involved in virtual movement of phantom limbs: comparison with normal subjects. Neurosurgery. 2003;53:1342–1352. [PubMed: 14633300]
  123. Saadah E.S, Melzack R. Phantom limb experiences in congenital limb-deficient adults. Cortex. 1994;30:479–485. [PubMed: 7805388]
  124. Sakata H, Takaoka Y, Kawarasaki A, Shibutani H. Somatosensory properties of neurons in the superior parietal cortex (area 5) of the rhesus monkey. Brain Res. 1973;64:85–102. [PubMed: 4360893]
  125. Schaefer M, Flor H, Heinze H. J, Rotte M. Morphing the body: Illusory feeling of an elongated arm affects somatosensory homunculus. Neuroimage. 2007;36:700–705. [PubMed: 17499523]
  126. Scherberger H, Goodale M. A, Andersen R. A. Target selection for reaching and saccades share a similar behavioral reference frame in the macaque. J Neurophysiol. 2003;89:1456–1466. [PubMed: 12612028]
  127. Schicke T, Bauer F, Röder B. Interactions of different body parts in peripersonal space: how vision of the foot influences tactile perception at the hand. Exp Brain Res. 2009;192:703–715. [PubMed: 18841353]
  128. Schicke T, Röder B. Spatial remapping of touch: Confusion of perceived stimulus order across hand and foot. Proc Natl Acad Sci U S A. 2006;103:11808–11813. [PMC free article: PMC1544251] [PubMed: 16864789]
  129. Schlack A, Sterbing-D’Angelo S. J, Hartung K, Hoffmann K. P, Bremmer F. Multisensory space representations in the macaque ventral intraparietal area. J Neurosci. 2005;25:4616–4625. [PubMed: 15872109]
  130. Sellal F, Renaseau-Leclerc C, Labrecque R. The man with 6 arms. An analysis of supernumerary phantom limbs after right hemisphere stroke. Rev Neurol (Paris) 1996;152:190–195. [PubMed: 8761629]
  131. Shams L, Kamitani Y, Shimojo S. Visual illusion induced by sound. Brain Res Cogn Brain Res. 2002;14:147–152. [PubMed: 12063138]
  132. Shore D.I, Spry E, Spence C. Confusing the mind by crossing the hands. Brain Res Cogn Brain Res. 2002;14:153–163. [PubMed: 12063139]
  133. Simmel M.L. The reality of phantom sensations. Soc Res. 1962;29:337–356.
  134. Spence C, Pavani F, Driver J. Spatial constraints on visual-tactile cross-modal distractor congruency effects. Cogn Affect Behav Neurosci. 2004a;4:148–169. [PubMed: 15460922]
  135. Spence C, Pavani F, Maravita A, Holmes N. Multisensory contributions to the 3-D representation of visuotactile peripersonal space in humans: Evidence from the crossmodal congruency task. J Physiol Paris. 2004b;98:171–189. [PubMed: 15477031]
  136. Stein B.E, Wallace M. T, Meredith M. A. Neural mechanisms mediating attention and orientation to multisensory cues. In: Gazzaniga M. S, editor. The Cognitive Neurosciences. Cambridge, MA: MIT Press, Bradford Book; 1995. pp. 683–702.
  137. Stricanne B, Andersen R. A, Mazzoni P. Eye-centered, head-centered, and intermediate coding of remembered sound locations in area LIP. J Neurophysiol. 1996;76:2071–2076. [PubMed: 8890315]
  138. Stuphorn V, Bauswein E, Hoffmann K. P. Neurons in the primate superior colliculus coding for arm movements in gaze-related coordinates. J Neurophysiol. 2000;83:1283–1299. [PubMed: 10712456]
  139. Tark K.J, Curtis C. E. Persistent neural activity in the human frontal cortex when maintaining space that is off the map. Nat Neurosci. 2009;12:1463–1468. [PMC free article: PMC3171293] [PubMed: 19801987]
  140. Trommershäuser J, Maloney L. T, Landy M. S. Statistical decision theory and trade-offs in the control of motor response. Spat Vis. 2003;16:255–275. [PubMed: 12858951]
  141. Tsakiris M. My body in the brain: a neurocognitive model of body-ownership. Neuropsychologia. 2010;48:703–712. [PubMed: 19819247]
  142. Tsakiris M, Hesse M. D, Boy C, Haggard P, Fink G. R. Neural signatures of body ownership: A sensory network for bodily self-consciousness. Cereb Cortex. 2007;17:2235–2244. [PubMed: 17138596]
  143. Türker K.S, Yeo P. L, Gandevia S. C. Perceptual distortion of face deletion by local anaesthesia of the human lips and teeth. Exp Brain Res. 2005;165:37–43. [PubMed: 15818498]
  144. Walton M.M, Bechara B, Gandhi N. J. Role of the primate superior colliculus in the control of head movements. J Neurophysiol. 2007;98:2022–2037. [PMC free article: PMC3646069] [PubMed: 17581848]
  145. Xing J, Andersen R. A. Models of the posterior parietal cortex which perform multimodal integration and represent space in several coordinate frames. J Cogn Neurosci. 2000;12:601–614. [PubMed: 10936913]
  146. Yamamoto S, Kitazawa S. Reversal of subjective temporal order due to arm crossing. Nat Neurosci. 2001a;4:759–765. [PubMed: 11426234]
  147. Yamamoto S, Kitazawa S. Sensation at the tips of invisible tools. Nat Neurosci. 2001b;4:979–980. [PubMed: 11544483]
  148. Yamamoto S, Moizumi S, Kitazawa S. Referral of tactile sensation to the tips of L-shaped sticks. J Neurophysiol. 2005;93:2856–2863. [PubMed: 15634708]
  149. Yue Z, Bischof G. N, Zhou X, Spence C, Röder B. Spatial attention affects the processing of tactile and visual stimuli presented at the tip of a tool: An event-related potential study. Exp Brain Res. 2009;193:119–128. [PubMed: 18936924]
Copyright © 2012 by Taylor & Francis Group, LLC.