
      Cortical Hierarchies Perform Bayesian Causal Inference in Multisensory Perception

Research Article
Tim Rohe 1,*, Uta Noppeney 1,2,*
PLoS Biology, Public Library of Science


          Abstract

          To form a veridical percept of the environment, the brain needs to integrate sensory signals from a common source but segregate those from independent sources. Thus, perception inherently relies on solving the “causal inference problem.” Behaviorally, humans solve this problem optimally as predicted by Bayesian Causal Inference; yet, the underlying neural mechanisms are unexplored. Combining psychophysics, Bayesian modeling, functional magnetic resonance imaging (fMRI), and multivariate decoding in an audiovisual spatial localization task, we demonstrate that Bayesian Causal Inference is performed by a hierarchy of multisensory processes in the human brain. At the bottom of the hierarchy, in auditory and visual areas, location is represented on the basis that the two signals are generated by independent sources (= segregation). At the next stage, in posterior intraparietal sulcus, location is estimated under the assumption that the two signals are from a common source (= forced fusion). Only at the top of the hierarchy, in anterior intraparietal sulcus, the uncertainty about the causal structure of the world is taken into account and sensory signals are combined as predicted by Bayesian Causal Inference. Characterizing the computational operations of signal interactions reveals the hierarchical nature of multisensory perception in human neocortex. It unravels how the brain accomplishes Bayesian Causal Inference, a statistical computation fundamental for perception and cognition. Our results demonstrate how the brain combines information in the face of uncertainty about the underlying causal structure of the world.

          The human brain uses Bayesian Causal Inference to integrate and segregate information by encoding multiple estimates of spatial relationships at different levels of the auditory and visual processing hierarchies. Read the accompanying Primer.

          Author Summary

How can the brain integrate signals into a veridical percept of the environment without knowing whether they pertain to the same or different events? For example, I can hear a bird and I can see a bird, but is it one bird singing on the branch, or is it two birds (one sitting on the branch and the other singing in the bush)? Recent studies demonstrate that human observers solve this problem optimally as predicted by Bayesian Causal Inference; yet, the neural mechanisms remain unclear. By combining psychophysics, Bayesian modeling, functional magnetic resonance imaging (fMRI), and multivariate decoding in an audiovisual localization task, we show that Bayesian Causal Inference is performed by a neural hierarchy of multisensory processes. At the bottom of the hierarchy, in auditory and visual areas, location is represented on the basis that the two signals are generated by independent sources (= segregation). At the next stage, in posterior intraparietal sulcus, location is estimated under the assumption that the two signals are from a common source (= forced fusion). Only at the top of the hierarchy, in anterior intraparietal sulcus, the uncertainty about the world’s causal structure is taken into account and sensory signals are combined as predicted by Bayesian Causal Inference.
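
As a concrete illustration of the three estimates named above, the sketch below implements the standard Bayesian Causal Inference observer from the literature (Körding et al., 2007) in Python. The noise and prior parameters are illustrative placeholders, not the values fitted in this study.

```python
import numpy as np

def bci_estimate(x_a, x_v, sigma_a=8.0, sigma_v=2.0, sigma_p=15.0, p_common=0.5):
    """Auditory location estimate under Bayesian Causal Inference (model averaging).

    x_a, x_v: noisy internal auditory/visual samples (degrees of visual angle);
    sigma_a, sigma_v: sensory noise SDs; sigma_p: SD of a central spatial prior;
    p_common: prior probability that both signals come from one source.
    All parameter values are illustrative, not the authors' fitted values.
    """
    va, vv, vp = sigma_a**2, sigma_v**2, sigma_p**2

    # Likelihood of the two samples given a common source (C = 1),
    # with the unknown source location integrated out analytically.
    var1 = va * vv + va * vp + vv * vp
    like_c1 = np.exp(-0.5 * ((x_a - x_v)**2 * vp + x_a**2 * vv + x_v**2 * va)
                     / var1) / (2 * np.pi * np.sqrt(var1))

    # Likelihood given two independent sources (C = 2).
    like_c2 = np.exp(-0.5 * (x_a**2 / (va + vp) + x_v**2 / (vv + vp))) \
              / (2 * np.pi * np.sqrt((va + vp) * (vv + vp)))

    # Posterior probability that there is a single common source.
    post_c1 = like_c1 * p_common / (like_c1 * p_common + like_c2 * (1 - p_common))

    # "Forced fusion": reliability-weighted average of both cues and the prior.
    s_fused = (x_a / va + x_v / vv) / (1 / va + 1 / vv + 1 / vp)
    # "Segregation": the auditory cue is combined with the spatial prior only.
    s_seg_a = (x_a / va) / (1 / va + 1 / vp)

    # Model averaging: weight the two estimates by the causal posterior.
    return post_c1 * s_fused + (1 - post_c1) * s_seg_a

print(bci_estimate(x_a=10.0, x_v=-2.0))  # large conflict -> mostly segregated
print(bci_estimate(x_a=3.0, x_v=2.0))    # small conflict -> mostly fused
```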

Most cited references (30)


          The ventriloquist effect results from near-optimal bimodal integration.

          Ventriloquism is the ancient art of making one's voice appear to come from elsewhere, an art exploited by the Greek and Roman oracles, and possibly earlier. We regularly experience the effect when watching television and movies, where the voices seem to emanate from the actors' lips rather than from the actual sound source. Originally, ventriloquism was explained by performers projecting sound to their puppets by special techniques, but more recently it is assumed that ventriloquism results from vision "capturing" sound. In this study we investigate spatial localization of audio-visual stimuli. When visual localization is good, vision does indeed dominate and capture sound. However, for severely blurred visual stimuli (that are poorly localized), the reverse holds: sound captures vision. For less blurred stimuli, neither sense dominates and perception follows the mean position. Precision of bimodal localization is usually better than either the visual or the auditory unimodal presentation. All the results are well explained not by one sense capturing the other, but by a simple model of optimal combination of visual and auditory information.
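
The "simple model of optimal combination" invoked here is maximum-likelihood cue fusion: each cue is weighted by its reliability (inverse variance), which reproduces visual capture for sharp visual stimuli, auditory capture for heavily blurred ones, and better-than-unimodal bimodal precision. A minimal Python sketch with illustrative numbers:

```python
import numpy as np

def fuse(s_v, sigma_v, s_a, sigma_a):
    """Maximum-likelihood fusion of a visual and an auditory location cue."""
    w_v = sigma_v**-2 / (sigma_v**-2 + sigma_a**-2)       # weight ~ reliability
    s_hat = w_v * s_v + (1 - w_v) * s_a                   # weighted mean
    sigma_hat = np.sqrt(1 / (sigma_v**-2 + sigma_a**-2))  # <= min(sigma_v, sigma_a)
    return s_hat, sigma_hat

# Sharp visual cue: vision "captures" sound (classic ventriloquism).
print(fuse(s_v=0.0, sigma_v=1.0, s_a=5.0, sigma_a=8.0))
# Heavily blurred visual cue: the reverse, sound captures vision.
print(fuse(s_v=0.0, sigma_v=8.0, s_a=5.0, sigma_a=1.0))
```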

            Is neocortex essentially multisensory?

            Although sensory perception and neurobiology are traditionally investigated one modality at a time, real world behaviour and perception are driven by the integration of information from multiple sensory sources. Mounting evidence suggests that the neural underpinnings of multisensory integration extend into early sensory processing. This article examines the notion that neocortical operations are essentially multisensory. We first review what is known about multisensory processing in higher-order association cortices and then discuss recent anatomical and physiological findings in presumptive unimodal sensory areas. The pervasiveness of multisensory influences on all levels of cortical processing compels us to reconsider thinking about neural processing in unisensory terms. Indeed, the multisensory nature of most, possibly all, of the neocortex forces us to abandon the notion that the senses ever operate independently during real-world cognition.

              Neuronal oscillations and multisensory interaction in primary auditory cortex.

              Recent anatomical, physiological, and neuroimaging findings indicate multisensory convergence at early, putatively unisensory stages of cortical processing. The objective of this study was to confirm somatosensory-auditory interaction in A1 and to define both its physiological mechanisms and its consequences for auditory information processing. Laminar current source density and multiunit activity sampled during multielectrode penetrations of primary auditory area A1 in awake macaques revealed clear somatosensory-auditory interactions, with a novel mechanism: somatosensory inputs appear to reset the phase of ongoing neuronal oscillations, so that accompanying auditory inputs arrive during an ideal, high-excitability phase, and produce amplified neuronal responses. In contrast, responses to auditory inputs arriving during the opposing low-excitability phase tend to be suppressed. Our findings underscore the instrumental role of neuronal oscillations in cortical operations. The timing and laminar profile of the multisensory interactions in A1 indicate that nonspecific thalamic systems may play a key role in the effect.
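
As a toy illustration of the phase-reset mechanism proposed above: if a somatosensory input resets the phase of an ongoing oscillation, the response gain for a subsequent auditory input depends on where in the cycle it lands. The frequency and modulation depth below are illustrative assumptions, not values from the recordings.

```python
import numpy as np

F = 8.0  # assumed frequency (Hz) of the ongoing oscillation

def gain(dt_ms):
    """Relative response gain for a sound arriving dt_ms after the phase reset."""
    phase = 2 * np.pi * F * dt_ms / 1000.0  # oscillation phase at sound onset
    return 1.0 + 0.5 * np.cos(phase)        # excitability modulates gain by +/-50%

print(gain(0.0))   # high-excitability phase -> amplified response (1.5)
print(gain(62.5))  # half a cycle later, low-excitability phase -> suppressed (0.5)
```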

                Author and article information

Contributors
Role: Academic Editor (Glasgow University, United Kingdom)

Journal
PLoS Biology (PLoS Biol), Public Library of Science (San Francisco, CA, USA)
ISSN: 1544-9173 (print); 1545-7885 (electronic)
Published: 24 February 2015 (February 2015 issue)
Volume 13, Issue 2: e1002073

Affiliations
[1] Max Planck Institute for Biological Cybernetics, Tuebingen, Germany
[2] Computational Neuroscience and Cognitive Robotics Centre, University of Birmingham, Birmingham, United Kingdom
                Author notes

                The authors have declared that no competing interests exist.

                Conceived and designed the experiments: TR UN. Performed the experiments: TR. Analyzed the data: TR UN. Contributed reagents/materials/analysis tools: TR UN. Wrote the paper: TR UN.

Article
Manuscript ID: PBIOLOGY-D-14-02149
DOI: 10.1371/journal.pbio.1002073
PMCID: PMC4339735
PMID: 25710328
Copyright © 2015

This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

History
Received: 18 June 2014
Accepted: 9 January 2015
                Page count
                Figures: 3, Tables: 1, Pages: 18
                Funding
                This work was funded by the Max Planck Society and the European Research Council (ERC-StG-MultSens). The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.
                Categories
                Research Article
Data Availability
All relevant data are within the paper and its Supporting Information files (Data S1).

Life sciences
