Abstract
Classic Bayesian models of perceptual inference describe how an ideal observer would integrate ‘unisensory’ measurements (multisensory integration) and attribute sensory signals to their origin(s) (causal inference). However, in the brain, sensory signals are always received in the context of a multisensory bodily state—namely, in combination with other senses. Moreover, sensory signals from both interoceptive sensing of one's own body and exteroceptive sensing of the world are highly interdependent and never occur in isolation. Thus, the observer must fundamentally determine whether each sensory observation is from an external (versus internal, self-generated) source to even be considered for integration. Critically, solving this primary causal inference problem requires knowledge of multisensory and sensorimotor dependencies. Thus, multisensory processing is needed to separate sensory signals. These multisensory processes enable us to simultaneously form a sense of self and make distinct perceptual decisions about the external world. In this opinion paper, we review and discuss the similarities and distinctions between multisensory decisions underlying the sense of self and those directed at acquiring information about the world. We call attention to the fact that heterogeneous multisensory processes take place all along the neural hierarchy (even in forming ‘unisensory’ observations) and argue that more integration of these aspects, in theory and experiment, is required to obtain a more comprehensive understanding of multisensory brain function.
This article is part of the theme issue ‘Decision and control processes in multisensory perception’.
Keywords: Bayesian, perception, interoception, exteroception, body
1. More than multisensory
Imagine yourself sitting on the beach. Despite the busy scene, with some people playing beach-bats nearby, you discern the sound of a chirp and spot a green parrot sitting on a branch. Classical theories of multisensory processing inspect such cases and define the rules for integration (or segregation) of the audio-visual signals [1,2]. However, the full inference process is much more complex than typically assumed. We often take for granted the multitude of processes underlying our ability to sense (and make sense of) the external world. For example, for stable vision, the brain uses efferent information to cancel out bodily and ocular motion [3–6]. Moreover, internal sensory information conveyed by proprioception (e.g. where are my eyes looking?), the vestibular system (e.g. how is my head positioned?) and interoception (e.g. the sound is external, not my heartbeat) is seamlessly and implicitly taken into account to allow the explicit sensory experience of the world around us [7–9]. All our sensory experiences are in actuality based on both internal and external multisensory information.
2. The fallacy of ‘unisensory’ perception
Perception of objects and events in the environment is a vital skill for survival and everyday behaviour. To achieve this, the brain needs to perform a multitude of perceptual decisions, on an ongoing basis [10–13]. In doing so, it must filter the constant barrage of signals from our various senses and interpret them appropriately. A solid theoretical foundation, based on signal detection theory and Bayesian statistics, has underpinned major advances in our understanding of perceptual decision making in the brain over the past few decades [14–21]. The rationale behind this computational approach is that signals received from our senses are noisy, and that to interpret them the brain needs to solve a statistical optimization problem—specifically, to infer their most likely cause(s).
Armed with these powerful analytical tools, classic psychophysics experiments have been interpreted in terms of, and used to quantify, the brain's ability to measure a plethora of sensory and perceptual features [22–26]. Furthermore, Bayesian theories of perceptual inference explain how the brain integrates separate measurements from the individual sensory cues into a unified percept (multisensory integration) [27–33]. For experimental control, other factors, besides the feature(s) being measured, are typically held constant in laboratory conditions (e.g. eye fixation, synthetic stimuli with custom manipulations). Such measurements are often loosely termed ‘unisensory’ because they measure responses to stimuli experimentally controlled in one specific modality. However, it is imperative to note that our other senses can never be turned off.
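Although we argue below that true ‘unisensory’ measurements do not exist, the standard integration computation itself is simple. As a concrete illustration, the following minimal sketch (in Python; our own construction, with arbitrary numbers rather than values from any cited study) implements reliability-weighted fusion of two Gaussian cues, in the spirit of [30,32]:

```python
import numpy as np

def integrate_cues(x_a, sigma_a, x_v, sigma_v):
    """Reliability-weighted (maximum-likelihood) fusion of two cues.

    Each cue is weighted by its reliability (inverse variance), so the
    fused estimate is pulled toward the more reliable cue.
    """
    w_a = 1 / sigma_a**2  # reliability of the auditory cue
    w_v = 1 / sigma_v**2  # reliability of the visual cue
    s_hat = (w_a * x_a + w_v * x_v) / (w_a + w_v)  # fused estimate
    sigma_hat = np.sqrt(1 / (w_a + w_v))           # fused uncertainty
    return s_hat, sigma_hat

# Illustrative example: a noisy auditory cue and a sharper visual cue
# to the location of a source (in degrees).
s_hat, sigma_hat = integrate_cues(x_a=8.0, sigma_a=4.0, x_v=5.0, sigma_v=1.0)
print(f"fused estimate = {s_hat:.2f} deg, fused s.d. = {sigma_hat:.2f} deg")
```

The fused standard deviation is always smaller than that of either cue alone, the hallmark benefit of integration. The sections below question the nature of the ‘unisensory’ inputs (x_a and x_v) that such models take for granted.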
Even when one's eyes are fixated, and the head is stationary, proprioceptive signals from the eye muscles and vestibular signals regarding head position continue to bombard the brain [8,9]. In complete darkness, the visual cortex remains highly active (and this activity even differs with eyes open or closed [34]). Interoceptive and tactile signals are always present [7]. Thus, ‘unisensory’ perception is a misnomer. In ‘unisensory’ experiments (or measurements), signals from the sense being probed are always received in combination with signals from the other senses and interpreted within that multisensory context. Thus, while we may decide to manipulate sensory signals and ask for decisions in one sensory domain in our experiments, we are always in practice impacting multiple sensory streams. Put simply, there is no, nor has there ever been, true unisensory stimulation.
3. A shortfall of the current Bayesian multisensory perspective
The Bayesian framework of perceptual inference has been incredibly valuable and influential for understanding multisensory integration. However, there is a gap between the way it is modelled and tested in the laboratory, and actual multisensory brain function. First, Bayesian multisensory integration models rely on receiving two (or more) ‘unisensory’ cues that undergo integration. But, as described above, there are no pure ‘unisensory’ cues in the brain. Rather, each unisensory cue exists within a multisensory context and therefore is itself an outcome of multisensory processing. This creates a problem of circularity.
Let us consider this from the perspective of building a Bayesian model of perceptual inference. The first step is to specify the ‘generative model’ [35] (figure 1)—a statistical description of how the sensory observations come about. In its simplest form, this comprises a stimulus (s) and the brain's measurement thereof (x, an ‘observation’). This represents the simplest ‘unisensory’ case (the black components in figure 1a). The arrow leading from s to x indicates that the observation depends on the stimulus. But, in reality, the observation also depends on all the other sensory systems (grey circles and arrows in figure 1). Bayesian theory has a way to deal with these ‘nuisance’ parameters (via a process called marginalization [36]). However, the dimensionality of the problem increases with the number of parameters, and their values are usually unknown. Even just identifying all the influential parameters is intractable. So, in practice, these multisensory dependencies are largely ignored.
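To see why this becomes intractable, consider marginalizing even a single nuisance variable. The sketch below (our own construction for illustration, not a model from the cited literature) assumes a hypothetical eye-position signal n that additively offsets the measurement, and computes the marginal likelihood p(x|s) = Σn p(x|s,n)p(n) on a numerical grid. Each additional unmodelled sense would add another grid dimension, so the cost grows exponentially with the number of nuisance parameters:

```python
import numpy as np
from scipy.stats import norm

s_grid = np.linspace(-20, 20, 401)  # candidate stimulus values (deg)
n_grid = np.linspace(-10, 10, 201)  # candidate eye positions (deg), a nuisance
sigma_x, sigma_n = 2.0, 3.0         # assumed measurement noise and prior width
x = 4.0                             # the observed measurement

prior_n = norm.pdf(n_grid, 0, sigma_n)
prior_n /= prior_n.sum()            # discretized prior over eye position

# Marginal likelihood p(x | s): average p(x | s, n) over the prior on n,
# assuming the generative model x ~ N(s + n, sigma_x).
lik = np.array([np.sum(norm.pdf(x, s + n_grid, sigma_x) * prior_n)
                for s in s_grid])

posterior = lik * norm.pdf(s_grid, 0, 10)  # broad Gaussian prior over s
posterior /= posterior.sum()
print("posterior mean estimate:", np.sum(s_grid * posterior))
```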
Figure 1.
Perception is inherently multisensory. (a) Black components present the simplest generative Bayesian model (‘unisensory’ condition). Parameters s and x represent the stimulus and observation, respectively. The stimulus (s) can be described by the probability distribution p(s). The observation is dependent on the stimulus (black arrow) and has a probability distribution p(x|s) reflecting sensory noise. The grey circles and arrows depict other modalities that influence the ‘unisensory’ observation, which are typically ignored. (b) A Bayesian causal inference model to determine whether an auditory and a visual measurement (xA and xV, respectively) were caused by one source (C = 1) or two separate sources (C = 2). Here too, the measurements are each influenced by other modalities (grey circles and arrows) typically not taken into account. (Online version in colour.)
Beyond the practical problem of needing to marginalize over all the senses and states, there is also a conceptual problem, in that Bayesian multisensory models presuppose that some brain circuit receives ‘unisensory’ observations as input. For example, the well-known ‘causal inference’ model [2] (figure 1b) describes how best to decide whether two unisensory observations (e.g. auditory, xA and visual, xV) were caused by one source (C = 1, bottom left panel), in which case they should be integrated, or two separate sources (C = 2, bottom right panel), in which case they should not be integrated. But these ‘unisensory’ observations are in fact each a result of multisensory processing. They depend on the states and input from other senses (grey circles and arrows in figure 1b): e.g. proprioceptive signals from the eyes and neck, vestibular signals from the head, and corollary discharge from intended movements all strongly influence visual and auditory observations [5,6,37–41].
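For concreteness, here is a minimal sketch of the causal inference computation depicted in figure 1b, using the closed-form likelihoods of the model in [2] (the parameter values are arbitrary, for illustration only). Note that nothing in this computation asks whether xA or xV might be self-generated; that more fundamental inference is implicitly assumed to have already been made:

```python
import numpy as np
from scipy.stats import norm

def p_common(x_a, x_v, sigma_a, sigma_v, sigma_p, prior_c1=0.5):
    """Posterior probability that auditory and visual measurements (x_a, x_v)
    share one cause (C = 1), following the model structure of [2].
    A zero-mean Gaussian prior (width sigma_p) over source location is assumed.
    """
    # C = 1: both measurements arise from a single source s ~ N(0, sigma_p)
    var1 = sigma_a**2 * sigma_v**2 + sigma_p**2 * (sigma_a**2 + sigma_v**2)
    like_c1 = np.exp(-((x_a - x_v)**2 * sigma_p**2
                       + x_a**2 * sigma_v**2
                       + x_v**2 * sigma_a**2) / (2 * var1)) \
              / (2 * np.pi * np.sqrt(var1))
    # C = 2: two independent sources, each drawn from the same prior
    like_c2 = (norm.pdf(x_a, 0, np.sqrt(sigma_a**2 + sigma_p**2))
               * norm.pdf(x_v, 0, np.sqrt(sigma_v**2 + sigma_p**2)))
    return like_c1 * prior_c1 / (like_c1 * prior_c1 + like_c2 * (1 - prior_c1))

# Nearby measurements favour a common cause; discrepant ones favour two.
print(p_common(x_a=2.0, x_v=1.0, sigma_a=4.0, sigma_v=1.0, sigma_p=10.0))
print(p_common(x_a=15.0, x_v=-5.0, sigma_a=4.0, sigma_v=1.0, sigma_p=10.0))
```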
Moreover, sensory systems are not passive receivers. Rather, they are part of sensorimotor loops. Not only do the movements we make influence sensations; humans and animals also actively move to sense the environment (active sensing) [42,43]. We move our eyes to probe the visual scene and move our hands to generate tactile input. This means that sensory signals can only be interpreted within a sensorimotor context, in addition to the mixing of signals from multiple senses. Thus, the observer must fundamentally determine whether each sensory observation is from an external (versus internal, self-generated) source to even be considered for integration. This creates a chicken-and-egg problem, in which multisensory integration and causal inference are needed to supply the inputs for multisensory integration and causal inference.
4. Mixed signals from the start
Mixing of multisensory signals begins very early in the neuronal hierarchy. Anatomically and physiologically, multisensory signals reach far into the sensory periphery. Even lower-level reflexes are altered by multisensory signals. The pupillary light reflex is larger for multisensory stimuli [44–46]. Thus, multisensory signals affect visual inputs even before they reach the retina. In the auditory and vestibular systems, efferent signals from the brain reach all the way to the sensory end organs [47–49]. Hence, even signals from the sensory end organs are not strictly ‘unisensory’.
Seminal work on multisensory integration in the superior colliculus exposed subcortical convergence of visual, auditory and somatosensory signals in single neurons [50]. Interestingly, some of these features are not present at birth, but rather develop with experience in a multisensory world [51]. However, not all multisensory function is learned. Certain aspects are innate and present irrespective of experience [51,52]. Also, some aspects, such as ‘superadditivity’ (when the response to the combined cues exceeds the sum of the responses to the single cues [53]), seem to be mainly a feature of subcortical, rather than cortical, multisensory processing [54].
Cortically, multisensory signals are seen already in early (once considered ‘unisensory’) areas [55–58]. These areas even have cross-modal interconnections [59–61]. As one moves up the cortical hierarchy, multisensory responses in single neurons become ubiquitous. Notably, however, the types of interactions are heterogeneous across (and even within) areas. For example, neuronal responses to visual and vestibular stimuli in extrastriate visual area MSTd (the dorsal part of the medial superior temporal area) reflect summation of signals in some (congruently tuned) neurons, but diminished sensitivity in other (oppositely tuned) neurons [62]. In the visual posterior Sylvian area, visual and vestibular responses in single neurons more commonly have opposite tuning [63]. In the ventral intraparietal area of the posterior parietal cortex, heterogeneous visual and vestibular responses are seen, together with strong decision-related activity [64–66]. Lastly, cross-modal (visual–vestibular) recalibration reveals very different phenomena across different multisensory areas [67,68]. During the same perceptual recalibration, ‘unisensory’ tuning (to the same cue) can shift in opposite directions in two different brain areas [68].
5. Multiple stages of heterogeneous multisensory processing
Multisensory processing therefore occurs all along the brain hierarchy, where each stage (and modality) requires specific and heterogeneous functions. Although ‘multisensory integration’ is often used as an umbrella term, we think that it should be reserved for the specific case in which two (or more) underlying cues' measurements of a stimulus are combined into a unified estimate [33]. For a general term, we prefer ‘multisensory processing’ (see table 1 for a list of common multisensory terms, and their meanings, as we use them here). Take, for example, the task of estimating the location in space of a brief auditory stimulus. An observer will need to take into account their current head position at the time of the stimulus. Hence, it is a multisensory process. But here, proprioceptive signals from the neck and body do not carry information regarding the event per se. Thus, they are not integrated together with the auditory measurement. Rather, the auditory measurement is interpreted within their context, to obtain an allocentric location estimate. Hence, we would not consider this ‘multisensory integration’ in the strict sense of the term (rather, it is a different type of multisensory processing; see the sketch after table 1).
Table 1.
Multisensory terms.
term | meaning
--- | ---
multisensory processing | a host of brain functions that deal with multiple sensory modalities' inputs and states—for a variety of different purposes, including integration, suppression, etc.
multisensory integration | combining multiple sensory measurements from different modalities into a unified estimate of the stimulus
multisensory suppression | the neuronal or behavioural responses to a stimulus are reduced by the presence of another stimulus from a different modality
multisensory interactions (or dependencies) | when the response or measurement from one modality depends on the state (or input) of another
multisensory causal inference | probabilistic inference regarding the origin(s) of the experienced stimuli
multisensory recalibration | shifting the estimates of one or more sensory modalities in relation to each other or to the environment
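The head-position example above can be made concrete with a short sketch (our own simplified construction; the names and numbers are illustrative). The proprioceptive head signal is not fused with the auditory measurement; it merely transforms the measurement's frame of reference. Accordingly, and in contrast to integration, the combined uncertainty grows rather than shrinks:

```python
import numpy as np

def allocentric_location(x_aud, head_yaw, sigma_aud, sigma_head):
    """Interpret a head-centred auditory estimate within the context of a
    proprioceptive/vestibular head-orientation signal (all in degrees).

    The head signal carries no information about the auditory event per se,
    so it is not integrated with the auditory cue; it only re-expresses the
    estimate in world-centred coordinates. Its noise nonetheless propagates,
    so the variances add (unlike integration, where variance decreases).
    """
    loc = x_aud + head_yaw                   # frame transformation, not fusion
    sigma = np.hypot(sigma_aud, sigma_head)  # s.d.s add in quadrature
    return loc, sigma

loc, sigma = allocentric_location(x_aud=10.0, head_yaw=-30.0,
                                  sigma_aud=3.0, sigma_head=2.0)
print(f"world-centred estimate: {loc:.1f} deg (s.d. {sigma:.1f} deg)")
```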
The classic case of multisensory causal inference (e.g. to determine whether or not an external visual and an auditory observation reflect the same source; figure 1b) lies toward the upper end of this computational hierarchy. Its inputs have presumably already undergone other basic multisensory processing to determine that they are both of external (versus internal) origin—e.g. the chirp and the parrot from the example at the opening of this article. If the sound resulted from one's own action, then it would be irrelevant to consider whether it came from the same source as the visual observation. Thus, fundamental multisensory processes need to take place before higher-level causal inference and/or multisensory integration are invoked (or at least be a part of these processes). This notion, that the brain's signals can only be understood within a multisensory sensorimotor context from the periphery onward, raises many questions regarding multisensory processing. How does the brain ‘know’ at each stage how to handle these signals? How do these functions develop in the absence of pure unisensory signals? Does the process to determine whether stimuli are of external versus internal origin need to happen before causal inference (to attribute the stimuli to one or more sources) can be invoked? Or can these inferences happen in parallel?
6. A primary mission to dissociate self- versus externally generated signals
Perhaps the most fundamental distinction regarding sensory signals is whether they are internally generated by the organism itself or arise from the environment [69]. This inference is often taken for granted in multisensory processing schemas, yet delineating the sensory signals arising from the organism from those arising from the outside world is computationally complex. One must remember that these sensory streams are constantly entwined and that there is no known a priori labelling of internal versus external signals; rather, this distinction must be learned and inferred from experience.
At any given moment, our sensory measurements concurrently reflect a mixture of internally generated signals, externally generated signals and cross-modal influences (black, red and grey arrows, respectively, in figure 2). Thus, returning to the beach example from the opening of this article: to perform causal inference on the external signals (deciding whether the visual and auditory observations both came from the parrot; figure 2), one must first determine that these signals indeed come from the environment. How, then, can this primary case of causal inference, enabling the delineation of the self from the world, come to be?
Figure 2.
Observations are driven by internally and externally generated signals. Sensory measurements (black circles) are dependent on internally generated signals (black arrows), externally generated signals (e.g. originating from the parrot; red arrows) and cross-modal influences (grey arrows).
7. Multisensory interactions bound in the bodily self
William James famously noted that our body, ‘is always there’ [70], denoting the primacy of the bodily self in experience. The basic embodied sense of selfhood is considered pre-reflexive (i.e. it does not require explicit awareness or attentional resources) and phenomenologically transparent [71,72]. Current accounts suggest that predictive processing of multisensory signals may underlie the formation of coherent bodily self models [73–75]. Such accounts thus propose that the self is formed through predictive models of the integration of exteroceptive [76–78], volitional [79–82] and interoceptive signals [83–87].
Empirical work has revealed that the bodily self is achieved and maintained via complex interactions among interoceptive and exteroceptive signals [76,88–91]. For example, body ownership, the experience of being in a physical body [71], has been shown to rely on interactions between multisensory signals. Experimentally, illusory body ownership can be induced when viewing touch on a rubber or virtual limb that is coupled with synchronous touch on one's unseen real limb [91,92]. In such cases, the brain is exposed to conflicting multisensory information (e.g. the rubber hand is not in the location of one's real hand) but the visual and tactile signals are synchronous, and the brain must resolve this conflict by making a causal inference regarding the most probable origins of the signals. Interestingly, in this case, the conflict is typically resolved in favour of the corresponding visual–tactile signals and the rubber hand is felt as belonging to the self, demonstrated by behavioural, physiological and brain responses [90,92–95]. Similarly, illusory ownership may be induced by cardio-visual synchrony [95,96]. Indeed, recent accounts have suggested that body ownership may be accounted for using causal inference principles [95,97–99]. According to this approach, based on rubber hand illusion experiments, body ownership is computed using a causal inference process, where if a common cause is inferred for visual, tactile and proprioceptive signals, the rubber hand is perceived as part of the self [97,98]. However, if the visuo-tactile stimulation is asynchronous, or the rubber hand is rotated or distanced from the real hand, the illusion dissipates. Thus, the brain combines prior expectations regarding the multisensory statistical probabilities with incoming sensory signals to decide if these should be bound together as occurrences happening to the ‘self’.
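The logic of these experiments can be sketched as causal inference over visuo-tactile timing, broadly in the spirit of [97,98] (this is our own simplified illustration; all numeric values are assumptions, not fitted parameters):

```python
import numpy as np
from scipy.stats import norm

def p_ownership(delta_t_ms, sigma_t=150.0, prior_c1=0.5, window_ms=2000.0):
    """Posterior probability that seen and felt touches share a common cause
    (i.e. the seen hand is 'my' hand), given their asynchrony delta_t_ms.

    C = 1 (common cause): asynchronies are small, ~N(0, sigma_t).
    C = 2 (unrelated events): asynchrony is uniform over the trial window.
    sigma_t, prior_c1 and window_ms are illustrative assumptions.
    """
    like_c1 = norm.pdf(delta_t_ms, 0, sigma_t)
    like_c2 = 1.0 / window_ms
    return like_c1 * prior_c1 / (like_c1 * prior_c1 + like_c2 * (1 - prior_c1))

for dt in (0.0, 300.0, 600.0):
    print(f"asynchrony {dt:4.0f} ms -> p(ownership) = {p_ownership(dt):.2f}")
```

As in the empirical findings, the inferred probability of ownership falls off steeply as the visuo-tactile asynchrony grows.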
8. Sensorimotor expectations
Similarly, self-generated, volitional actions provide robust signals for segregating the self from the environment [3,69,100]. Predictive accounts based on the ‘comparator model’ suggest that volitional actions engage forward models, which predict the afferent sensory signals arising from the action [101,102]. The predicted and afferent sensory signals are then compared and, if they match, the sensory effects can be attributed to the self. These are then often cancelled out at both the perceptual [103–105] and neural level ([106,107], but see [108]). A classic example of such predictive causal inference, first noted by von Helmholtz [3], is the cancellation of the effects of self-generated oculomotor and bodily motions on visual information [3,4]. While we make constant movements with our eyes, head and trunk, the resulting motion of the image on the retina is perceptually attenuated, affording us a stable perception of the world. Similar predictive mechanisms are thought to enable the segregation of exafferent (world-generated) versus reafferent (self-generated) sensory signals across numerous modalities. In humans, the sense of agency, our feeling of control over our actions, is similarly thought to arise from predictive processing of sensorimotor signals [80,92,93,109], allowing explicit segregation of the self from the world through volitional actions [72,110–113].
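The comparator logic can be summarized in a few lines (a minimal sketch; the threshold and attenuation values are illustrative assumptions, and the neural implementation is of course graded and far richer):

```python
import numpy as np

def comparator(predicted, afferent, threshold=1.0, attenuation=0.3):
    """Sketch of the 'comparator model' [101,102].

    An efference-copy-driven forward model predicts the sensory consequences
    of an action. If the prediction matches the afferent signal, the signal
    is tagged as self-generated and attenuated; otherwise it is passed on
    unattenuated, as a candidate external (exafferent) event.
    """
    error = afferent - predicted          # prediction error
    if np.abs(error) < threshold:         # match: attribute to self
        return attenuation * afferent, 'self-generated (attenuated)'
    return afferent, 'external (unattenuated)'

print(comparator(predicted=5.0, afferent=5.2))  # e.g. self-touch: attenuated
print(comparator(predicted=0.0, afferent=5.2))  # e.g. unexpected touch: external
```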
9. Interoceptive signals
In recent years, research regarding the impact of interoceptive information such as cardiac, respiratory and visceral signals on perception [84,114,115] and the construction of self models [83,85,89,116,117] has flourished. At the theoretical level, it has been proposed that predictive processing of interoceptive signals, critical for homeostasis, may serve as the basic scaffold for the embodied self [86,89,118–120]. Indeed, under normal conditions, interoceptive signals are suppressed from explicit awareness [84,121,122] and have been shown to be modulated by experimentally induced changes in the bodily self [83,88,123]. This suggests that they are a fundamental aspect of the self model. In fact, some researchers have proposed that interoceptive signals driving homeostatic regulation may have a privileged status for maintaining an organism's stability and as such may serve as ‘first priors’ (fundamental rules guiding learning) [86]. It is likely that these signals undergo similar causal inference processes early in development, enabling the precise predictions required for homeostatic survival. Moreover, recent evidence has shown that active sampling of the external world by exteroceptive senses such as vision and touch is unconsciously timed to the cardiac cycle to allow better signal-to-noise differentiation [124,125].
Taken together, these perspectives point to predictive inference processes, integrating rich multisensory information, as the driving impetus for the formation of a sense of bodily self. As such, the self can be viewed as a spatio-temporal construct for which the brain has developed predictive capacities regarding sensory signals, based on learned models of their co-dependencies. This fundamental model can later serve to make the primary distinction between the self and the world, allowing more fine-grained inferences regarding the origins of sensory signals. But how are such models acquired?
10. Development of a self model
While the precise processes underlying the acquisition of a multisensory model of the self remain unknown, we can outline several types of mechanisms by which it may be formed. Many aspects of multisensory interactions and integration are biologically hard-wired into the neural system. For example, Gibson and others have proposed that the infant brain is inherently multisensory [126,127]. Neuroanatomical structures, such as the superior colliculus, have converging inputs from visual, auditory and tactile modalities, allowing multisensory interactions at early stages of neural processing [50,128]. Similarly, reflexes present in early postnatal development attest to an innate link between different sensorimotor modalities [129]. Moreover, reflexes have been shown to be modulated by self-generated, in addition to external, stimulation [130]. Despite these innate aspects, it is clear that building a proficient model of multisensory dependencies is experience-dependent [51,128], with increased refinement during development. Moreover, there is evidence that learning of a multisensory model underlying the self may even begin in utero [87,131,132].
This multisensory experience, however, is not passive but is driven by both instinctive and volitional actions that generate sensory signals. Volitional actions, in particular, hold special potential for constructing the self model, as they allow high-precision inference regarding the predicted consequences of actions. Indeed, many predictive coding accounts suggest that volitional actions have a central role in the acquisition of a generative model of the self [111,132–135], by allowing directed exploration of co-dependencies between sensory signals [136,137]. Thus, the bodily self may serve as a ‘first prior’, binding together multisensory signals and allowing a distinction between self-generated signals and those arising from the external world. Moreover, the organism needs to dynamically maintain (recalibrate) its multiple sensory systems on an ongoing basis [138–140]. Once these fundamental strata of (internal) multisensory processing are accomplished, the senses can be directed to extract information about the (external) world.
11. Loops of multisensory processing
The formation of an implicit self model, based on inferences from the acquired sensorimotor predictions, may then enable further forms of multisensory inference. First, learned multisensory dependencies may allow marginalization and differentiation of ‘unisensory’ signals. For example, once retinal movement related to self-generated motion is accounted for and intrinsically modelled, one may attribute other forms of retinal movement to movement in the external world. Across all sensory modalities and interactions, the subtraction of predicted sensory signals arising from intrinsic correspondences and actions (the self model) allows distillation of specific sensory streams. Over time, this process could allow the differentiation of ‘unisensory’ signals for perceiving the external world. Indeed, our experience of bodily sensations (e.g. proprioceptive, interoceptive and vestibular) is typically not the focus of our explicit awareness [141–144], which is directed toward the external world. Correspondingly, this transparent phenomenology leaves semantic representations of these internal, self-related signals severely impoverished in comparison with those of exteroceptive sensory signals. Thus, at any given moment (figure 2), our explicit awareness will typically be focused on an external object in the world (I see a parrot), while the multitude of sensory interactions enabling this sensation are bound under an implicit model simply denoted as ‘I’. The attenuation of the self model is consistently found in both phenomenology and the brain [84,104,106,107,145], allowing enhanced perception of external stimuli.
12. Future outlook
In summary, the common assumption that the brain receives unisensory signals from the environment is false. Accordingly, the questions of causal inference and multisensory integration of these signals are only the ‘tip of the iceberg’. Below, supporting these functions, lies a complex set of inherently multisensory processes. First, the organism needs to form a model of self, to be able to fundamentally determine which stimuli are externally versus internally generated. This model is acquired through learned multisensory and sensorimotor dependencies, and built on a neural substrate with the innate potential to make these connections. Then, only after early multisensory processing to determine external (versus internal) origins, to distil the signal from one's own actions, and to marginalize over other sensory states, do classic causal inference and multisensory integration of signals from the environment become relevant.
Some experiments and modelling may still need to assume unisensory inputs. And of course, such studies have been paramount to building our foundational understanding of multisensory processing. Thus, our point is not to criticize this practice. Rather, we aim to highlight the weight (and limitations) of these assumptions, and to call attention to the multisensory processes that take place all along the neural hierarchy, which are often ignored in modern multisensory modelling. Multisensory processing is not just a high-level function. It is fundamental to being. Thus, the scientific questions regarding multisensory brain function have perhaps been largely understated, and their scope underestimated.
Greater appreciation of this might open up new avenues for research. First, connecting researchers across different disciplines will force the refinement of semantics (sometimes the same term has different meanings, or the same phenomenology or computation has different terms) and the broadening of horizons to the full scope of the multisensory problem. It may also help elucidate open questions in the field, such as ‘opposite’ (versus ‘congruent’) multisensory tuning, and contrary neuronal recalibration in different brain areas [68]. Internal models built on our own bodily experience are also used to perceive other people in the world [146,147]. Thus, how the causal inference that determines internal versus external stimulus origin is used by, or influenced by, this function is an open question. Lastly, many disorders of brain function have been linked to aberrant ‘unisensory’ or multisensory function or causal inference, e.g. autism [148–151], Parkinson's disease [152,153], schizophrenia [109,154] and many more. One therefore needs to be cognizant of the fact that all ‘unisensory’ performance includes a measure of multisensory processing, and that testing a specific multisensory function probes just one aspect amongst a broad range of heterogeneous functions. Better integration (on our behalf) of these disparate aspects of multisensory processing is essential for a comprehensive understanding of the brain, in health and disease.
Contributor Information
Adam Zaidel, Email: adam.zaidel@biu.ac.il.
Roy Salomon, Email: royesal@gmail.com.
Data accessibility
This article has no additional data.
Authors' contributions
A.Z.: conceptualization, funding acquisition, writing—original draft, writing—review and editing; R.S.: conceptualization, funding acquisition, writing—original draft, writing—review and editing.
Both authors gave final approval for publication and agreed to be held accountable for the work performed herein.
Conflict of interests
We declare we have no competing interests.
Funding
This research was supported by the Israel Science Foundation (ISF) grants nos 1291/20 and 3318/20 to A.Z. and ISF grant no. 1169/17 to R.S.
References
1. Battaglia PW, Jacobs RA, Aslin RN. 2003. Bayesian integration of visual and auditory signals for spatial localization. J. Opt. Soc. Am. A 20, 1391-1397. (doi:10.1364/JOSAA.20.001391)
2. Kording KP, Beierholm U, Ma WJ, Quartz S, Tenenbaum JB, Shams L. 2007. Causal inference in multisensory perception. PLoS ONE 2, e943. (doi:10.1371/journal.pone.0000943)
3. von Helmholtz H. 1867. Handbuch der physiologischen Optik [Treatise on physiological optics]. Leipzig, Germany: Voss. [In German.]
4. Sperry RW. 1950. Neural basis of the spontaneous optokinetic response produced by visual inversion. J. Comp. Physiol. Psychol. 43, 482-489. (doi:10.1037/h0055479)
5. Matin L, Picoult E, Stevens JK, Edwards MW Jr, Young D, MacArthur R. 1982. Oculoparalytic illusion: visual-field dependent spatial mislocalizations by humans partially paralyzed with curare. Science 216, 198-201. (doi:10.1126/science.7063881)
6. Guthrie BL, Porter JD, Sparks DL. 1983. Corollary discharge provides accurate eye position information to the oculomotor system. Science 221, 1193-1195. (doi:10.1126/science.6612334)
7. Chen WG, et al. 2021. The emerging science of interoception: sensing, integrating, interpreting, and regulating signals within the self. Trends Neurosci. 44, 3-16. (doi:10.1016/j.tins.2020.10.007)
8. Day BL, Fitzpatrick RC. 2005. The vestibular system. Curr. Biol. 15, R583-R586. (doi:10.1016/j.cub.2005.07.053)
9. Krauzlis RJ, Goffart L, Hafed ZM. 2017. Neuronal control of fixation and fixational eye movements. Phil. Trans. R. Soc. B 372, 20160205. (doi:10.1098/rstb.2016.0205)
10. Hanks TD, Summerfield C. 2017. Perceptual decision making in rodents, monkeys, and humans. Neuron 93, 15-31. (doi:10.1016/j.neuron.2016.12.003)
11. O'Connell RG, Shadlen MN, Wong-Lin K, Kelly SP. 2018. Bridging neural and computational viewpoints on perceptual decision-making. Trends Neurosci. 41, 838-852. (doi:10.1016/j.tins.2018.06.005)
12. Najafi F, Churchland AK. 2018. Perceptual decision-making: a field in the midst of a transformation. Neuron 100, 453-462. (doi:10.1016/j.neuron.2018.10.017)
13. O'Connell RG, Kelly SP. 2021. Neurophysiology of human perceptual decision-making. Annu. Rev. Neurosci. 44, 495-516. (doi:10.1146/annurev-neuro-092019-100200)
14. Tanner WP Jr, Swets JA. 1954. A decision-making theory of visual detection. Psychol. Rev. 61, 401-409. (doi:10.1037/h0058700)
15. Green DM, Swets JA. 1966. Signal detection theory and psychophysics. New York, NY: Wiley.
16. Knill D, Richards W. 1996. Perception as Bayesian inference. Cambridge, UK: Cambridge University Press.
17. Kersten D, Mamassian P, Yuille A. 2004. Object perception as Bayesian inference. Annu. Rev. Psychol. 55, 271-304. (doi:10.1146/annurev.psych.55.090902.142005)
18. Ma WJ, Beck JM, Latham PE, Pouget A. 2006. Bayesian inference with probabilistic population codes. Nat. Neurosci. 9, 1432-1438. (doi:10.1038/nn1790)
19. Beck JM, Ma WJ, Kiani R, Hanks T, Churchland AK, Roitman J, Shadlen MN, Latham PE, Pouget A. 2008. Probabilistic population codes for Bayesian decision making. Neuron 60, 1142-1152. (doi:10.1016/j.neuron.2008.09.021)
20. Friston K, Kiebel S. 2009. Predictive coding under the free-energy principle. Phil. Trans. R. Soc. B 364, 1211-1221. (doi:10.1098/rstb.2008.0300)
21. Swets JA. 2014. Signal detection theory and ROC analysis in psychology and diagnostics: collected papers. New York, NY: Psychology Press.
22. Hubel DH, Wiesel TN. 1962. Receptive fields, binocular interaction and functional architecture in the cat's visual cortex. J. Physiol. 160, 106-154. (doi:10.1113/jphysiol.1962.sp006837)
23. Mountcastle VB. 1957. Modality and topographic properties of single neurons of cat's somatic sensory cortex. J. Neurophysiol. 20, 408-434. (doi:10.1152/jn.1957.20.4.408)
24. Geisler WS. 1989. Sequential ideal-observer analysis of visual discriminations. Psychol. Rev. 96, 267-314. (doi:10.1037/0033-295X.96.2.267)
25. Stocker AA, Simoncelli EP. 2006. Noise characteristics and prior expectations in human visual speed perception. Nat. Neurosci. 9, 578-585. (doi:10.1038/nn1669)
26. Moore BCJ. 2012. An introduction to the psychology of hearing. London, UK: Brill.
27. Yuille AL, Bülthoff HH. 1996. Bayesian decision theory and psychophysics. In Perception as Bayesian inference (eds DC Knill, W Richards), pp. 123-161. Cambridge, UK: Cambridge University Press.
28. Jacobs RA. 1999. Optimal integration of texture and motion cues to depth. Vision Res. 39, 3621-3629. (doi:10.1016/S0042-6989(99)00088-7)
29. Landy MS, Kojima H. 2001. Ideal cue combination for localizing texture-defined edges. J. Opt. Soc. Am. A 18, 2307-2320. (doi:10.1364/josaa.18.002307)
30. Ernst MO, Banks MS. 2002. Humans integrate visual and haptic information in a statistically optimal fashion. Nature 415, 429-433. (doi:10.1038/415429a)
31. van Beers RJ, Wolpert DM, Haggard P. 2002. When feeling is more important than seeing in sensorimotor adaptation. Curr. Biol. 12, 834-837. (doi:10.1016/S0960-9822(02)00836-9)
32. Alais D, Burr D. 2004. The ventriloquist effect results from near-optimal bimodal integration. Curr. Biol. 14, 257-262. (doi:10.1016/j.cub.2004.01.029)
33. Angelaki DE, Gu Y, DeAngelis GC. 2009. Multisensory integration: psychophysics, neurophysiology, and computation. Curr. Opin. Neurobiol. 19, 452-458. (doi:10.1016/j.conb.2009.06.008)
34. Marx E, Deutschländer A, Stephan T, Dieterich M, Wiesmann M, Brandt T. 2004. Eyes open and eyes closed as rest conditions: impact on brain activation patterns. Neuroimage 21, 1818-1824. (doi:10.1016/j.neuroimage.2003.12.026)
35. Ma WJ. 2019. Bayesian decision models: a primer. Neuron 104, 164-175. (doi:10.1016/j.neuron.2019.09.037)
36. Ma WJ, Kording KP, Goldreich D. 2023. Bayesian models of perception and action. Cambridge, MA: MIT Press. See https://mitpress.mit.edu/9780262047593/bayesian-models-of-perception-and-action/.
37. Roll JP, Roll R. 1987. Kinaesthetic and motor effects of extraocular muscle vibration in man. In Eye movements from physiology to cognition (eds JK O'Regan, A Levy-Schoen), pp. 57-68. Amsterdam, The Netherlands: Elsevier. (doi:10.1016/B978-0-444-70113-8.50010-2)
38. Steinbach MJ. 1987. Proprioceptive knowledge of eye position. Vision Res. 27, 1737-1744. (doi:10.1016/0042-6989(87)90103-9)
39. Biguer B, Donaldson IML, Hein A, Jeannerod M. 1988. Neck muscle vibration modifies the representation of visual motion and direction in man. Brain 111, 1405-1424. (doi:10.1093/brain/111.6.1405)
40. Lewald J, Karnath HO, Ehrenstein WH. 1999. Neck-proprioceptive influence on auditory lateralization. Exp. Brain Res. 125, 389-396. (doi:10.1007/s002210050695)
41. Genzel D, Firzlaff U, Wiegrebe L, MacNeilage PR. 2016. Dependence of auditory spatial updating on vestibular, proprioceptive, and efference copy signals. J. Neurophysiol. 116, 765-775. (doi:10.1152/jn.00052.2016)
42. Kleinfeld D, Ahissar E, Diamond ME. 2006. Active sensation: insights from the rodent vibrissa sensorimotor system. Curr. Opin. Neurobiol. 16, 435-444. (doi:10.1016/j.conb.2006.06.009)
43. Schroeder CE, Wilson DA, Radman T, Scharfman H, Lakatos P. 2010. Dynamics of active sensing and perceptual selection. Curr. Opin. Neurobiol. 20, 172-176. (doi:10.1016/j.conb.2010.02.010)
44. Wang CA, Boehnke SE, Itti L, Munoz DP. 2014. Transient pupil response is modulated by contrast-based saliency. J. Neurosci. 34, 408-417. (doi:10.1523/JNEUROSCI.3550-13.2014)
45. Rigato S, Rieger G, Romei V. 2016. Multisensory signalling enhances pupil dilation. Scient. Rep. 6, 26188. (doi:10.1038/srep26188)
46. Wang CA, Blohm G, Huang J, Boehnke SE, Munoz DP. 2017. Multisensory integration in orienting behavior: pupil size, microsaccades, and saccades. Biol. Psychol. 129, 36-44. (doi:10.1016/j.biopsycho.2017.07.024)
47. Guinan JJ Jr. 2006. Olivocochlear efferents: anatomy, physiology, function, and the measurement of efferent effects in humans. Ear Hear. 27, 589-607. (doi:10.1097/01.aud.0000240507.83072.e7)
48. Meredith GE. 1988. Comparative view of the central organization of afferent and efferent circuitry for the inner ear. Acta Biol. Hung. 39, 229-249.
49. Holt JC, Lysakowski A, Goldberg JM. 2011. The efferent vestibular system. In Auditory and vestibular efferents (eds DK Ryugo, RR Fay), pp. 135-186. New York, NY: Springer. (doi:10.1007/978-1-4419-7070-1_6)
50. Meredith MA, Stein BE. 1986. Visual, auditory, and somatosensory convergence on cells in superior colliculus results in multisensory integration. J. Neurophysiol. 56, 640-662. (doi:10.1152/jn.1986.56.3.640)
51. Stein BE, Stanford TR, Rowland BA. 2014. Development of multisensory integration from the perspective of the individual neuron. Nat. Rev. Neurosci. 15, 520-535. (doi:10.1038/nrn3742)
52. Wallace MT, Stein BE. 2001. Sensory and multisensory responses in the newborn monkey superior colliculus. J. Neurosci. 21, 8886-8894. (doi:10.1523/JNEUROSCI.21-22-08886.2001)
53. Stein BE, Meredith MA. 1993. The merging of the senses. Cambridge, MA: MIT Press.
54. Alais D, Newell F, Mamassian P. 2010. Multisensory processing in review: from physiology to behaviour. Seeing Perceiving 23, 3-38. (doi:10.1163/187847510X488603)
55. Driver J, Noesselt T. 2008. Multisensory interplay reveals crossmodal influences on ‘sensory-specific’ brain regions, neural responses, and judgments. Neuron 57, 11-23. (doi:10.1016/j.neuron.2007.12.013)
56. Foxe JJ, Wylie GR, Martinez A, Schroeder CE, Javitt DC, Guilfoyle D, Ritter W, Murray MM. 2002. Auditory-somatosensory multisensory processing in auditory association cortex: an fMRI study. J. Neurophysiol. 88, 540-543. (doi:10.1152/jn.2002.88.1.540)
57. Fu KMG, Johnston TA, Shah AS, Arnold L, Smiley J, Hackett TA, Garraghty PE, Schroeder CE. 2003. Auditory cortical neurons respond to somatosensory stimulation. J. Neurosci. 23, 7510-7515. (doi:10.1523/JNEUROSCI.23-20-07510.2003)
58. Sathian K, Zangaladze A, Hoffman JM, Grafton ST. 1997. Feeling with the mind's eye. Neuroreport 8, 3877-3881. (doi:10.1097/00001756-199712220-00008)
59. Falchier A, Clavagnier S, Barone P, Kennedy H. 2002. Anatomical evidence of multimodal integration in primate striate cortex. J. Neurosci. 22, 5749-5759. (doi:10.1523/JNEUROSCI.22-13-05749.2002)
60. Clavagnier S, Falchier A, Kennedy H. 2004. Long-distance feedback projections to area V1: implications for multisensory integration, spatial awareness, and visual consciousness. Cogn. Affect. Behav. Neurosci. 4, 117-126. (doi:10.3758/cabn.4.2.117)
61. Cappe C, Barone P. 2005. Heteromodal connections supporting multisensory integration at low levels of cortical processing in the monkey. Eur. J. Neurosci. 22, 2886-2902. (doi:10.1111/j.1460-9568.2005.04462.x)
62. Gu Y, Angelaki DE, DeAngelis GC. 2008. Neural correlates of multisensory cue integration in macaque MSTd. Nat. Neurosci. 11, 1201-1210. (doi:10.1038/nn.2191)
63. Chen A, DeAngelis GC, Angelaki DE. 2011. Convergence of vestibular and visual self-motion signals in an area of the posterior Sylvian fissure. J. Neurosci. 31, 11 617-11 627. (doi:10.1523/JNEUROSCI.1266-11.2011)
64. Bremmer F, Klam F, Duhamel JR, Ben Hamed S, Graf W. 2002. Visual–vestibular interactive responses in the macaque ventral intraparietal area (VIP). Eur. J. Neurosci. 16, 1569-1586. (doi:10.1046/j.1460-9568.2002.02206.x)
65. Chen A, DeAngelis GC, Angelaki DE. 2011. Representation of vestibular and visual cues to self-motion in ventral intraparietal cortex. J. Neurosci. 31, 12 036-12 052. (doi:10.1523/JNEUROSCI.0395-11.2011)
66. Zaidel A, DeAngelis GC, Angelaki DE. 2017. Decoupled choice-driven and stimulus-related activity in parietal neurons may be misrepresented by choice probabilities. Nat. Commun. 8, 715. (doi:10.1038/s41467-017-00766-3)
67. Zaidel A, Laurens J, DeAngelis GC, Angelaki DE. 2021. Supervised multisensory calibration signals are evident in VIP but not MSTd. J. Neurosci. 41, 10 108-10 119. (doi:10.1523/JNEUROSCI.0135-21.2021)
68. Zeng F, Zaidel A, Chen A. 2022. Contrary neuronal recalibration in different multisensory cortical areas. eLife 12, e82895. (doi:10.1101/2022.09.26.509476)
69. von Holst E, Mittelstaedt H. 1950. Das Reafferenzprinzip [The reafference principle]. Naturwissenschaften 37, 464-476. (doi:10.1007/BF00622503) [In German.]
70. James W. 1890. The principles of psychology, vol. 1. New York, NY: Henry Holt and Co. (doi:10.1037/10538-000)
71. De Vignemont F. 2018. Mind the body: an exploration of bodily self-awareness. Oxford, UK: Oxford University Press.
72. Gallagher S. 2000. Philosophical conceptions of the self: implications for cognitive science. Trends Cogn. Sci. 4, 14-21. (doi:10.1016/S1364-6613(99)01417-5)
73. Clark A. 2013. Whatever next? Predictive brains, situated agents, and the future of cognitive science. Behav. Brain Sci. 36, 181-204. (doi:10.1017/S0140525X12000477)
74. Limanowski J, Blankenburg F. 2013. Minimal self-models and the free energy principle. Front. Hum. Neurosci. 7, 547. (doi:10.3389/fnhum.2013.00547)
75. Apps MA, Tsakiris M. 2014. The free-energy self: a predictive coding account of self-recognition. Neurosci. Biobehav. Rev. 41, 85-97. (doi:10.1016/j.neubiorev.2013.01.029)
76. Blanke O, Slater M, Serino A. 2015. Behavioral, neural, and computational principles of bodily self-consciousness. Neuron 88, 145-166. (doi:10.1016/j.neuron.2015.09.029)
77. Salomon R. 2017. The assembly of the self from sensory and motor foundations. Social Cogn. 35, 87-106. (doi:10.1521/soco.2017.35.2.87)
78. Petkova VI, Björnsdotter M, Gentile G, Jonsson T, Li TQ, Ehrsson HH. 2011. From part- to whole-body ownership in the multisensory brain. Curr. Biol. 21, 1118-1122. (doi:10.1016/j.cub.2011.05.022)
79. Chambon V, Sidarus N, Haggard P. 2014. From action intentions to action effects: how does the sense of agency come about? Front. Hum. Neurosci. 8, 320. (doi:10.3389/fnhum.2014.00320)
80. Haggard P, Clark S, Kalogeras J. 2002. Voluntary action and conscious awareness. Nat. Neurosci. 5, 382-385. (doi:10.1038/nn827)
81. Stern Y, Koren D, Moebus R, Panishev G, Salomon R. 2020. Assessing the relationship between sense of agency, the bodily-self and stress: four virtual-reality experiments in healthy individuals. J. Clin. Med. 9, 2931. (doi:10.3390/jcm9092931)
82. de Vignemont F, Fourneret P. 2004. The sense of agency: a philosophical and empirical review of the ‘who’ system. Conscious. Cogn. 13, 1-19. (doi:10.1016/S1053-8100(03)00022-9)
83. Park HD, Bernasconi F, Bello-Ruiz J, Pfeiffer C, Salomon R, Blanke O. 2016. Transient modulations of neural responses to heartbeats covary with bodily self-consciousness. J. Neurosci. 36, 8453-8460. (doi:10.1523/JNEUROSCI.0311-16.2016)
84. Salomon R, Ronchi R, Dönz J, Bello-Ruiz J, Herbelin B, Martet R, Faivre N, Schaller K, Blanke O. 2016. The insula mediates access to awareness of visual stimuli presented synchronously to the heartbeat. J. Neurosci. 36, 5115-5127. (doi:10.1523/JNEUROSCI.4262-15.2016)
85. Seth AK, Suzuki K, Critchley HD. 2011. An interoceptive predictive coding model of conscious presence. Front. Psychol. 2, 395. (doi:10.3389/fpsyg.2011.00395)
86. Allen M, Tsakiris M. 2019. The body as first prior: interoceptive predictive processing and the primacy. In The interoceptive mind: from homeostasis to awareness (eds M Tsakiris, H De Preester), pp. 27-45. Oxford, UK: Oxford University Press.
87. Ciaunica A, Constant A, Preissl H, Fotopoulou K. 2021. The first prior: from co-embodiment to co-homeostasis in early life. Conscious. Cogn. 91, 103117. (doi:10.1016/j.concog.2021.103117)
88. Park HD, Bernasconi F, Salomon R, Tallon-Baudry C, Spinelli L, Seeck M, Schaller K, Blanke O. 2018. Neural sources and underlying mechanisms of neural responses to heartbeats, and their role in bodily self-consciousness: an intracranial EEG study. Cereb. Cortex 28, 2351-2364. (doi:10.1093/cercor/bhx136)
89. Park HD, Blanke O. 2019. Coupling inner and outer body for self-consciousness. Trends Cogn. Sci. 23, 377-388. (doi:10.1016/j.tics.2019.02.002)
90. Salomon R, Lim M, Pfeiffer C, Gassert R, Blanke O. 2013. Full body illusion is associated with widespread skin temperature reduction. Front. Behav. Neurosci. 7, 65. (doi:10.3389/fnbeh.2013.00065)
91. Botvinick M, Cohen J. 1998. Rubber hands ‘feel’ touch that eyes see. Nature 391, 756. (doi:10.1038/35784)
92. Kalckert A, Ehrsson HH. 2012. Moving a rubber hand that feels like your own: a dissociation of ownership and agency. Front. Hum. Neurosci. 6, 40. (doi:10.3389/fnhum.2012.00040)
93. Harduf A, Shaked A, Yaniv AU, Salomon R. In press. Disentangling the neural correlates of agency, ownership and multisensory processing. NeuroImage 120255. (doi:10.1016/j.neuroimage.2023.120255)
94. Ehrsson HH, Chancel M. 2019. Premotor cortex implements causal inference in multisensory own-body perception. Proc. Natl Acad. Sci. USA 116, 19 771-19 773. (doi:10.1073/pnas.1914000116)
95. Suzuki K, Garfinkel SN, Critchley HD, Seth AK. 2013. Multisensory integration across exteroceptive and interoceptive domains modulates self-experience in the rubber-hand illusion. Neuropsychologia 51, 2909-2917. (doi:10.1016/j.neuropsychologia.2013.08.014)
96. Aspell JE, Heydrich L, Marillier G, Lavanchy T, Herbelin B, Blanke O. 2013. Turning body and self inside out: visualized heartbeats alter bodily self-consciousness and tactile perception. Psychol. Sci. 24, 2445-2453. (doi:10.1177/0956797613498395)
97. Chancel M, Ehrsson HH, Ma WJ. 2022. Uncertainty-based inference of a common cause for body ownership. eLife 11, e77221. (doi:10.7554/eLife.77221)
98. Samad M, Chung AJ, Shams L. 2015. Perception of body ownership is driven by Bayesian sensory inference. PLoS ONE 10, e0117178. (doi:10.1371/journal.pone.0117178)
99. Fang W, Li J, Qi G, Li S, Sigman M, Wang L. 2019. Statistical inference of body representation in the macaque brain. Proc. Natl Acad. Sci. USA 116, 20 151-20 157. (doi:10.1073/pnas.1902334116)
100. Crapse TB, Sommer MA. 2008. Corollary discharge across the animal kingdom. Nat. Rev. Neurosci. 9, 587-600. (doi:10.1038/nrn2457)
101. Wolpert DM, Ghahramani Z, Jordan MI. 1995. An internal model for sensorimotor integration. Science 269, 1880. (doi:10.1126/science.7569931)
102. Blakemore S, Wolpert D, Frith C. 2002. Abnormalities in the awareness of action. Trends Cogn. Sci. 6, 237-242. (doi:10.1016/S1364-6613(02)01907-1)
103. Bays P, Wolpert D, Flanagan J. 2005. Perception of the consequences of self-action is temporally tuned and event driven. Curr. Biol. 15, 1125-1128. (doi:10.1016/j.cub.2005.05.023)
104. Kilteni K, Engeler P, Ehrsson HH. 2020. Efference copy is necessary for the attenuation of self-generated touch. iScience 23, 100843. (doi:10.1016/j.isci.2020.100843)
105. Blakemore S, Wolpert D, Frith C. 2000. Why can't you tickle yourself? Neuroreport 11, R11. (doi:10.1097/00001756-200008030-00002)
106. van Elk M, Salomon R, Kannape O, Blanke O. 2014. Suppression of the N1 auditory evoked potential for sounds generated by the upper and lower limbs. Biol. Psychol. 102, 108-117. (doi:10.1016/j.biopsycho.2014.06.007)
107. Shergill SS, White TP, Joyce DW, Bays PM, Wolpert DM, Frith CD. 2012. Modulation of somatosensory processing by action. Neuroimage 70, 356-362. (doi:10.1016/j.neuroimage.2012.12.043)
108. Press C, Thomas E, Yon D. 2022. Cancelling cancellation? Sensorimotor control, agency, and prediction. Neurosci. Biobehav. Rev. 145, 105012. (doi:10.1016/j.neubiorev.2022.105012)
109. Krugwasser AR, Stern Y, Faivre N, Harel EV, Salomon R. 2021. Impaired sense of agency and associated confidence in psychosis. Schizophrenia 8, 32. (doi:10.1038/s41537-022-00212-4)
110. Stern Y, Ben-Yehuda I, Koren D, Zaidel A, Salomon R. 2022. The dynamic boundaries of the self: serial dependence in embodied sense of agency. Cortex 152, 109-121. (doi:10.1016/j.cortex.2022.03.015)
111. Krugwasser AR, Harel EV, Salomon R. 2019. The boundaries of the self: the sense of agency across different sensorimotor aspects. J. Vis. 19, 14. (doi:10.1167/19.4.14)
112. Constant M, Salomon R, Filevich E. 2022. Judgments of agency are affected by sensory noise without recruiting metacognitive processing. eLife 11, e72356. (doi:10.7554/eLife.72356)
113. Hohwy J. 2007. The sense of self in the phenomenology of agency and perception. Psyche 13, 1-20.
114. Garfinkel SN, Seth AK, Barrett AB, Suzuki K, Critchley HD. 2015. Knowing your own heart: distinguishing interoceptive accuracy from interoceptive awareness. Biol. Psychol. 104, 65-74. (doi:10.1016/j.biopsycho.2014.11.004)
115. Gray MA, Rylander K, Harrison NA, Wallin BG, Critchley HD. 2009. Following one's heart: cardiac rhythms gate central initiation of sympathetic reflexes. J. Neurosci. 29, 1817-1825. (doi:10.1523/JNEUROSCI.3363-08.2009)
116. Craig AD. 2002. How do you feel? Interoception: the sense of the physiological condition of the body. Nat. Rev. Neurosci. 3, 655-666. (doi:10.1038/nrn894)
117. Seth AK, Tsakiris M. 2018. Being a beast machine: the somatic basis of selfhood. Trends Cogn. Sci. 22, 969-981. (doi:10.1016/j.tics.2018.08.008)
118. Seth AK. 2013. Interoceptive inference, emotion, and the embodied self. Trends Cogn. Sci. 17, 565-573. (doi:10.1016/j.tics.2013.09.007)
119. Barrett LF, Simmons WK. 2015. Interoceptive predictions in the brain. Nat. Rev. Neurosci. 16, 419-429. (doi:10.1038/nrn3950)
120. Ainley V, Apps MA, Fotopoulou A, Tsakiris M. 2016. ‘Bodily precision’: a predictive coding account of individual differences in interoceptive accuracy. Phil. Trans. R. Soc. B 371, 20160003. (doi:10.1098/rstb.2016.0003)
- 121.Salomon R, Ronchi R, Dönz J, Bello-Ruiz J, Herbelin B, Faivre N, Schaller K, Blanke O. 2018. Insula mediates heartbeat related effects on visual consciousness. Cortex 101, 87-95. ( 10.1016/j.cortex.2018.01.005) [DOI] [PubMed] [Google Scholar]
- 122.Van Elk M, Lenggenhager B, Heydrich L, Blanke O. 2014. Suppression of the auditory N1-component for heartbeat-related sounds reflects interoceptive predictive coding. Biol. Psychol. 99, 172-182. ( 10.1016/j.biopsycho.2014.03.004) [DOI] [PubMed] [Google Scholar]
- 123. Sel A, Azevedo RT, Tsakiris M. 2016. Heartfelt self: cardio-visual integration affects self-face recognition and interoceptive cortical processing. Cereb. Cortex 27, 5144-5155. (doi:10.1093/cercor/bhw296)
- 124. Galvez-Pol A, McConnell R, Kilner JM. 2020. Active sampling in visual search is coupled to the cardiac cycle. Cognition 196, 104149. (doi:10.1016/j.cognition.2019.104149)
- 125. Galvez-Pol A, Virdee P, Villacampa J, Kilner J. 2022. Active tactile discrimination is coupled with and modulated by the cardiac cycle. eLife 11, e78126. (doi:10.7554/eLife.78126)
- 126. Gibson JJ. 1966. The senses considered as perceptual systems. Boston, MA: Houghton Mifflin.
- 127. Lewkowicz DJ. 2000. The development of intersensory temporal perception: an epigenetic systems/limitations view. Psychol. Bull. 126, 281-308. (doi:10.1037/0033-2909.126.2.281)
- 128. Wallace MT. 2004. The development of multisensory processes. Cogn. Process. 5, 69-83. (doi:10.1007/s10339-004-0017-z)
- 129. Zafeiriou DI. 2004. Primitive reflexes and postural reactions in the neurodevelopmental examination. Pediatr. Neurol. 31, 1-8. (doi:10.1016/j.pediatrneurol.2004.01.012)
- 130. Rochat P, Hespos SJ. 1997. Differential rooting response by neonates: evidence for an early sense of self. Early Dev. Parenting 6, 105-112.
- 131. Ciaunica A, Crucianelli L. 2019. Minimal self-awareness: from within a developmental perspective. J. Conscious. Stud. 26, 207-226.
- 132. Delafield-Butt JT, Gangopadhyay N. 2013. Sensorimotor intentionality: the origins of intentionality in prospective agent action. Dev. Rev. 33, 399-425. (doi:10.1016/j.dr.2013.09.001)
- 133. Adams RA, Shipp S, Friston KJ. 2013. Predictions not commands: active inference in the motor system. Brain Struct. Funct. 218, 611-643. (doi:10.1007/s00429-012-0475-5)
- 134. Friston K. 2018. Does predictive coding have a future? Nat. Neurosci. 21, 1019-1021. (doi:10.1038/s41593-018-0200-7)
- 135. Allen M, Friston KJ. 2016. From cognitivism to autopoiesis: towards a computational framework for the embodied mind. Synthese 195, 2459-2482. (doi:10.1007/s11229-016-1288-5)
- 136. Kanazawa H, Yamada Y, Tanaka K, Kawai M, Niwa F, Iwanaga K, Kuniyoshi Y. 2023. Open-ended movements structure sensorimotor information in early human development. Proc. Natl Acad. Sci. USA 120, e2209953120. (doi:10.1073/pnas.2209953120)
- 137. Adolph KE, Hoch JE. 2019. Motor development: embodied, embedded, enculturated, and enabling. Annu. Rev. Psychol. 70, 141-164. (doi:10.1146/annurev-psych-010418-102836)
- 138. Zaidel A, Turner AH, Angelaki DE. 2011. Multisensory calibration is independent of cue reliability. J. Neurosci. 31, 13 949-13 962. (doi:10.1523/JNEUROSCI.2732-11.2011)
- 139. Zaidel A, Ma WJ, Angelaki DE. 2013. Supervised calibration relies on the multisensory percept. Neuron 80, 1544-1557. (doi:10.1016/j.neuron.2013.09.026)
- 140. Shalom-Sperber S, Chen A, Zaidel A. 2022. Rapid cross-sensory adaptation of self-motion perception. Cortex 148, 14-30. (doi:10.1016/j.cortex.2021.11.018)
- 141. Salomon R, Lim M, Herbelin B, Hesselmann G, Blanke O. 2013. Posing for awareness: proprioception modulates access to visual consciousness in a continuous flash suppression task. J. Vis. 13, 2. (doi:10.1167/13.7.2)
- 142. Salomon R, Kaliuzhna M, Herbelin B, Blanke O. 2015. Balancing awareness: vestibular signals modulate visual consciousness in the absence of awareness. Conscious. Cogn. 36, 289-297. (doi:10.1016/j.concog.2015.07.009)
- 143. Faivre N, Salomon R, Blanke O. 2015. Visual consciousness and bodily self-consciousness. Curr. Opin. Neurol. 28, 23-28. (doi:10.1097/WCO.0000000000000160)
- 144. Faivre N, Arzi A, Lunghi C, Salomon R. 2017. Consciousness is more than meets the eye: a call for a multisensory study of subjective experience. Neurosci. Conscious. 2017, nix003. (doi:10.1093/nc/nix003)
- 145. Weiss C, Herwig A, Schütz-Bosbach S. 2011. The self in action effects: selective attenuation of self-generated sounds. Cognition 121, 207-218. (doi:10.1016/j.cognition.2011.06.011)
- 146. Hardwick RM, Caspers S, Eickhoff SB, Swinnen SP. 2018. Neural correlates of action: comparing meta-analyses of imagery, observation, and execution. Neurosci. Biobehav. Rev. 94, 31-44. (doi:10.1016/j.neubiorev.2018.08.003)
- 147. Galvez-Pol A, Forster B, Calvo-Merino B. 2020. Beyond action observation: neurobehavioral mechanisms of memory for visually perceived bodies and actions. Neurosci. Biobehav. Rev. 116, 508-518. (doi:10.1016/j.neubiorev.2020.06.014)
- 148. Pellicano E, Burr D. 2012. When the world becomes ‘too real’: a Bayesian explanation of autistic perception. Trends Cogn. Sci. 16, 504-510. (doi:10.1016/j.tics.2012.08.009)
- 149. Zaidel A, Goin-Kochel RP, Angelaki DE. 2015. Self-motion perception in autism is compromised by visual noise but integrated optimally across multiple senses. Proc. Natl Acad. Sci. USA 112, 6461-6466. (doi:10.1073/pnas.1506582112)
- 150. Noel JP, Shivkumar S, Dokka K, Haefner RM, Angelaki DE. 2022. Aberrant causal inference and presence of a compensatory mechanism in autism spectrum disorder. eLife 11, e71866. (doi:10.7554/eLife.71866)
- 151. Feigin H, Shalom-Sperber S, Zachor DA, Zaidel A. 2021. Increased influence of prior choices on perceptual decisions in autism. eLife 10, e61595. (doi:10.7554/eLife.61595)
- 152. Yakubovich S, Israeli-Korn S, Halperin O, Yahalom G, Hassin-Baer S, Zaidel A. 2020. Visual self-motion cues are impaired yet overweighted during visual–vestibular integration in Parkinson's disease. Brain Commun. 2, fcaa035. (doi:10.1093/braincomms/fcaa035)
- 153. Halperin O, Israeli-Korn S, Yakubovich S, Hassin-Baer S, Zaidel A. 2020. Self-motion perception in Parkinson's disease. Eur. J. Neurosci. 53, 2376-2387. (doi:10.1111/ejn.14716)
- 154. Salomon R, Kannape OA, Debarba HG, Kaliuzhna M, Schneider M, Faivre N, Eliez S, Blanke O. 2021. Agency deficits in a human genetic model of schizophrenia: insights from 22q11DS patients. Schizophr. Bull. 48, 495-504. (doi:10.1093/schbul/sbab143)
Data Availability Statement
This article has no additional data.