Scripta Varia

Ghost in the Machine

Olaf Blanke*

Neuroscience research has investigated some of the major mechanisms of conscious processing (e.g. Koch, 2004; Dehaene and Changeux, 2011). Influential data regarding the neural correlates of consciousness came from observations in neurological patients (e.g. Pöppel et al., 1973; Weiskrantz et al., 1974; Bisiach et al., 1979), later extended by psychophysical research and brain imaging (e.g. Dehaene and Changeux, 2011; Dehaene, this issue). Although these studies have led to a better understanding of perceptual consciousness, they have mostly targeted visual consciousness, whereas conscious and unconscious perception in the other senses remains underexplored, despite its importance for consciousness, which is multisensory and integrated in nature (Faivre et al., 2015, 2017).

Recently, consciousness research has turned to the observer, the subject of conscious experience, which was not accounted for in these models of visual-perceptual consciousness, even though the self as the subject of conscious experience is a fundamental property of perceptual consciousness and some have even argued that a subject pervades all conscious experience. Conscious perception is thus not only a multisensory experience of external objects, but also includes the experience of a unitary subject. In what follows I summarize what is known about the brain mechanisms associated with the ghost in the machine: the feeling that the objects of conscious perception are experienced by somebody, by a self. Many notions of self have been defined and studied in the neurosciences, and many different classifications have been proposed. Here I highlight only two kinds of self, the cognitive self and the conscious self. Both are relevant for engineering and AI, but only one is fundamentally relevant for consciousness.

The multidimensional cognitive self includes self-related cognitions ranging from memory to language, imagery, theory of mind, and many other cognitive functions, and overlaps with Dan Dennett's narrative self and Ulric Neisser's extended self (e.g. Neisser, 1988; Dennett, 1991; Gallagher, 2000). For example, the capacity to attribute mental states to self and others in order to predict and explain behavior (theory of mind) is often listed as part of the cognitive self. Research here has targeted brain mechanisms that may distinguish theory of mind for self and other, for example the different sensory inputs and cognitive systems processing signals relevant for attributing mental states to oneself versus to others. Contrary to our intuition that we know our own mind better than those of others, there is much evidence that the brain mechanisms for attributing mental states to oneself and to other people are actually quite similar and shared (e.g. Gopnik and Meltzoff, 1994). Another self-relevant cognitive function is memory. John Locke already proposed that memory processes are a crucial building block of the self. Memory may ensure continuity of the self across time (and space), and recent studies have defined the self-relevant brain mechanisms of mental time travel (e.g. Arzy et al., 2008; Schacter et al., 2007, 2012) and of autobiographical memory (Levine et al., 1998). Comparable to the systems dedicated to verbal or visuo-spatial memories, the remembered self provides humans with the capacity to store and recall their own past life events, to imagine life events from their past, and to imagine and predict future life events (Arzy et al., 2008; Schacter et al., 2007, 2012). Theory-of-mind and memory-related aspects of the self are highly conscious self-representations and can be mentally accessed like any other cognitive operation.
Future machines possessing capacities related to the cognitive self may thus be considered more likely to be conscious than machines lacking such cognitive functions. Advancing the neuroscientific understanding of the cognitive self and implementing it in machines will probably make those machines more powerful. However, it is not likely to make them conscious machines, because despite the importance of these systems for cognition, and despite the mental access they may provide to such cognitive self-representations, they are distinct from a fundamental central processing system mediating the phenomenal self: the conscious self.

What kind of system should be implemented in a machine so that it is more likely to have phenomenal consciousness? Recent evidence, ranging from clinical to experimental data, suggests that the processing of specific bodily signals is what is needed to have the unitary experience of being the subject of conscious experience. This conscious self is based on the processing of multisensory bodily (and motor) signals, in particular trunk-centered multisensory signals representing the person's body as a global and unitary entity (Blanke, 2012), and is characterized by congruent self-identification, self-location, and first-person perspective. Experimental studies in healthy subjects have used different visuo-tactile and visuo-vestibular stimulations to induce global changes in the conscious self, such as 'full-body', 'out-of-body', or 'body-swap' illusions (Ehrsson, 2007; Lenggenhager et al., 2007; Petkova and Ehrsson, 2008). Typically in these paradigms, tactile stimulations are repeatedly applied for several minutes to the back or chest of a participant who is being filmed and who simultaneously views (on a virtual reality headset) the stroking of a human body or avatar in real time. Under such stimulation, changes in the conscious self occur: participants self-identify with the seen virtual body and their self-location shifts towards the position of the virtual body (and thus away from, or less towards, their physical body). Additional visuo-vestibular conflicts may also change the experienced direction of the subjective first-person perspective (e.g. Ionta et al., 2011; Pfeiffer et al., 2016). Similar effects on the conscious self have also been observed when the integration of interoceptive bodily signals is tested (Aspell et al., 2013), linking the present concept of the conscious self to interoception-based self models (e.g. Craig, 2002; Park and Tallon-Baudry, 2014).
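The dependence of these illusions on visuo-tactile synchrony can be caricatured in a toy model (purely illustrative; the function form, parameters, and numbers below are my own assumptions, not fitted data): the drift of experienced self-location toward the virtual body is strongest when felt and seen stroking are synchronous and decays as the delay between them grows.

```python
import math

def self_location_drift(delay_s, max_drift_cm=25.0, tau_s=0.3):
    """Hypothetical toy model: drift of experienced self-location (cm)
    toward the virtual body decays exponentially with the visuo-tactile
    delay (s) between felt and seen stroking. All parameter values are
    illustrative assumptions, not empirical estimates."""
    return max_drift_cm * math.exp(-delay_s / tau_s)

# Synchronous stroking produces the strongest drift; a large asynchrony
# (the typical control condition) largely abolishes it.
print(self_location_drift(0.0))  # maximal drift
print(self_location_drift(1.0))  # strongly reduced drift
```

The exponential form merely encodes the qualitative finding that asynchronous stroking serves as a control condition in which the illusion is weak or absent.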

Several variants of such multisensory bodily illusions exist and were conceived to mimic alterations of the conscious self reported by neurological patients. The two most relevant clinical conditions, both based on abnormal multisensory integration of trunk-centered signals, are out-of-body experiences and heautoscopy. Out-of-body experiences are characterized by a first-person perspective that is not body-centered (i.e. the conscious self is experienced outside one's bodily borders, at an elevated position; Blanke et al., 2004; De Ridder et al., 2007). Heautoscopy is characterized by conscious bilocation and reduplication of the conscious self (i.e. the experience of two simultaneous conscious selves at two distinct spatial locations; Heydrich and Blanke, 2013). Just as visual illusions (e.g. the Ponzo or Ebbinghaus-Titchener illusion) have been important tools for refining models of visual perception and consciousness (Eagleman, 2001), the multisensory bodily illusions highlighted here will likely advance models of the conscious self. Computational implementations of the basic laws or constraints of the conscious self (e.g. proprioception, body-related visual information, peripersonal space, and embodiment; Blanke et al., 2015) may thus be systems with access to body-centered multisensory self-representations and may hence enable forms of phenomenal self-consciousness: a ghost in the machine with a tendency towards mind-body dualism.
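One standard building block for such a computational implementation is precision-weighted multisensory cue combination, sketched here for self-location (a minimal sketch under my own simplifying assumptions; it is not the specific model of Blanke et al., 2015): conflicting visual and proprioceptive estimates of where the body is are fused in proportion to their reliabilities, so that reliable visual information about a virtual body pulls the fused estimate toward it, mimicking the experimentally induced drift.

```python
def fuse_self_location(mu_vis, var_vis, mu_prop, var_prop):
    """Precision-weighted (maximum-likelihood) fusion of a visual and a
    proprioceptive estimate of self-location. Each cue is weighted by its
    inverse variance; returns the fused mean and its (reduced) variance."""
    precision_vis, precision_prop = 1.0 / var_vis, 1.0 / var_prop
    w_vis = precision_vis / (precision_vis + precision_prop)
    mu = w_vis * mu_vis + (1.0 - w_vis) * mu_prop
    var = 1.0 / (precision_vis + precision_prop)
    return mu, var

# Illustrative numbers (assumptions): vision places the virtual body
# 200 cm ahead with low noise; proprioception says 0 cm with higher
# noise. The fused self-location estimate shifts toward the virtual body.
mu, var = fuse_self_location(200.0, 25.0, 0.0, 100.0)
print(mu, var)  # 160.0 20.0
```

Lowering the visual variance strengthens the pull toward the seen body, which is one simple way to capture why congruent, reliable visual input dominates the illusions described above.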

The neuroscientific notions of cognitive self and conscious self relate differently to an old dichotomy, pursued since the dawn of philosophy, between the easy and the hard problem of consciousness (Chalmers, 1996), or between access and phenomenal consciousness (Block, 1995). It will be a fascinating neuroscience question whether the conscious self allows us to make advances on phenomenal consciousness and the hard problem. Does the conscious self pervade all conscious experience (visual, auditory, cognitive, emotional) and underlie both the integrated and unitary experience of being a conscious self or subject and the experience of being a subject with a certain qualitative conscious experience (say, the conscious experience of the blue sky)? In other words, do the abovementioned brain mechanisms of the conscious self, based on trunk-centered global bodily signals, also play a role in visual and auditory consciousness? In my opinion, too much speculation and too little experimentation and modeling have prevailed on this topic in the past. With many of the mechanisms of perceptual consciousness, as well as those of the conscious self, now well established, such studies seem possible. Some data suggest that changes in visual consciousness are tightly coupled with changes in bodily self-relevant signals (Park et al., 2014; Salomon et al., 2016; Faivre et al., 2016). Whether such phenomenal aspects of visual or auditory consciousness are mediated via a higher-order cognitive self-representation or the first-order conscious self-representation should open ground for fascinating research, including computational approaches. Such work in humans will require further experimental improvements as well as advances in virtual reality, augmented reality, and robotics/haptics.
Such technology needs to be tailored to the needs of cognitive neuroscience and brain imaging, advancing towards more systematic and fine-grained control of bodily states in humans (e.g. Rognini and Blanke, 2016). Whether machines will one day report out-of-body experiences and heautoscopy, or be disposed to psychosis (hallucinations and delusions have been linked to altered processing of the conscious self), remains to be seen, but may well happen.


* Laboratory of Cognitive Neuroscience, Brain Mind Institute, Ecole Polytechnique Fédérale de Lausanne, Switzerland; Center for Neuroprosthetics, School of Life Sciences, Ecole Polytechnique Fédérale de Lausanne, Switzerland; Department of Neurology, University Hospital, Geneva, Switzerland.



Arzy, S., Molnar-Szakacs, I., & Blanke, O. (2008). Self in time: imagined self-location influences neural activity related to mental time travel. J Neurosci, 28(25), 6502-7.

Aspell, J.E., Heydrich, L., Herbelin, B., & Blanke, O. (2013). Turning body and self inside out: visualized heartbeats alter bodily self-consciousness and tactile perception. Psychol Sci, 24, 2445-2453.

Blanke, O., Landis, T., Spinelli, L., & Seeck, M. (2004). Out-of-body experience and autoscopy of neurological origin. Brain, 127, 243-258.

Blanke, O. (2012) Multisensory brain mechanisms of bodily self-consciousness. Nat Rev Neurosci, 13, 556-71.

Blanke O., Slater M., Serino A. (2015) Behavioral, Neural, and Computational Principles of Bodily Self-Consciousness. Neuron. 88(1): 145-66.

Block, N. (1995) On a confusion about a function of consciousness. Beh Brain Sci 18: 227-287.

Bisiach, E., Luzzatti, C., & Perani, D. (1979). Unilateral neglect, representational schema and consciousness. Brain, 102, 609-618.

Chalmers, D. (1996). The Conscious Mind: In Search of a Fundamental Theory. New York: Oxford University Press.

Craig, A.D. (2002). How do you feel? Interoception: the sense of the physiological condition of the body. Nature Reviews Neuroscience, 3, 655-666.

Dehaene, S., & Changeux, J.P. (2011). Experimental and theoretical approaches to conscious processing. Neuron, 70, 200-227.

Dennett, D.C. (1991). Consciousness explained. New York: Penguin.

De Ridder D., Van Laere K., Dupont P., Menovsky T., Van de Heyning P. (2007) Visualizing out-of-body experience in the brain. N Engl J Med. 357(18): 1829-33.

Ehrsson, H.H. (2007). The experimental induction of out-of-body experiences. Science, 317, 1048.

Faivre, N., Salomon, R., & Blanke, O. (2015). Visual Consciousness and Bodily-Self Consciousness. Current Opinion in Neurology, 28(1), 23-28.

Faivre, N., Arzi, A., Lunghi, C., & Salomon, R. (2017). Consciousness is more than meets the eye: a call for a multisensory study of subjective experience. Neuroscience of Consciousness, 3(1), nix003.

Faivre, N., Dönz, J., Scandola, M., Dhanis, H., Bello Ruiz, J., Bernasconi, F., Salomon, R., & Blanke, O. (2017). Self-grounded vision: hand ownership modulates visual location through cortical β and γ oscillations. J Neurosci, 37(1), 11-22.

Gallagher, S. (2000). Philosophical conceptions of the self: implications for cognitive science. Trends in Cognitive Sciences, 4(1), 14-21.

Heydrich, L., & Blanke, O. (2013). Distinct illusory own-body perceptions caused by damage to posterior insula and extrastriate cortex. Brain, 136, 790-803.

Ionta, S., Heydrich, L., Lenggenhager, B., Mouthon, M., Fornari, E., Chapuis, D., … Blanke, O. (2011). Multisensory mechanisms in temporo-parietal cortex support self-location and first-person perspective. Neuron, 70, 363-374.

Koch, C. (2004). The Quest for Consciousness: A Neurobiological Approach. Englewood, CO: Roberts & Company.

Lenggenhager, B., Tadi, T., Metzinger, T., & Blanke, O. (2007). Video ergo sum: Manipulating bodily self-consciousness. Science, 317, 1096-1099.

Levine, B., Black, S.E., Cabeza, R., Sinden, M., McIntosh, A.R., Toth, J.P., Tulving, E., & Stuss, D.T. (1998). Episodic memory and the self in a case of isolated retrograde amnesia. Brain, 121, 1951-73.

Park, H.D., Correia, S., Ducorps, A., & Tallon-Baudry, C. (2014). Spontaneous fluctuations in neural responses to heartbeats predict visual detection. Nat Neurosci, 17(4), 612-8.

Park, H.D., & Tallon-Baudry, C. (2014). The neural subjective frame: from bodily signals to perceptual consciousness. Philos Trans R Soc Lond B Biol Sci, 369(1641), 20130208.

Petkova, V.I., & Ehrsson, H.H. (2008). If I were you: Perceptual illusion of body swapping. PLoS ONE, 3, e3832.

Pfeiffer, C., Grivaz, P., Herbelin, B., Serino, A., & Blanke, O. (2016). Visual gravity contributes to subjective first-person perspective. Neuroscience of Consciousness, (1), niw006. doi: 10.1093/nc/niw006

Pöppel, E., Held, R., & Frost, D. (1973). Residual visual function after brain wounds involving the central visual pathways in man. Nature, 243(5405), 295-6.

Rognini, G., & Blanke, O. (2016). Cognetics: robotic interfaces for the conscious mind. Trends Cogn Sci, 20(3), 162-4.

Salomon, R., Ronchi, R., Dönz, J., Bello-Ruiz, J., Herbelin, B., Martet, R., Faivre, N., Schaller, K., & Blanke, O. (2016). The insula mediates access to awareness of visual stimuli presented synchronously to the heartbeat. J Neurosci, 36(18), 5115-27.

Schacter, D.L., Addis, D.R., & Buckner, R.L. (2007). Remembering the past to imagine the future: the prospective brain. Nat Rev Neurosci, 8(9), 657-61.

Weiskrantz, L., Warrington, E.K., Sanders, M.D., & Marshall, J. (1974). Visual capacity in the hemianopic field following a restricted occipital ablation. Brain, 97(4), 709-28.


