
      Shared neural underpinnings of multisensory integration and trial-by-trial perceptual recalibration in humans


          Abstract

          Perception adapts to mismatching multisensory information, both when different cues appear simultaneously and when they appear sequentially. While both multisensory integration and adaptive trial-by-trial recalibration are central for behavior, it remains unknown whether they are mechanistically linked and arise from a common neural substrate. To relate the neural underpinnings of sensory integration and recalibration, we measured whole-brain magnetoencephalography while human participants performed an audio-visual ventriloquist task. Using single-trial multivariate analysis, we localized the perceptually-relevant encoding of multisensory information within and between trials. While we found neural signatures of multisensory integration within temporal and parietal regions, only medial superior parietal activity encoded past and current sensory information and mediated the perceptual recalibration within and between trials. These results highlight a common neural substrate of sensory integration and perceptual recalibration, and reveal a role of medial parietal regions in linking present and previous multisensory evidence to guide adaptive behavior.
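
          For readers unfamiliar with the paradigm, the two behavioural effects at stake are commonly quantified per trial roughly as follows; the notation below is generic and not taken from the article itself:

          VE_t = r_t^{AV} - s_t^{A}   (localization bias toward the visual position on an audio-visual trial with discrepancy \Delta_t = s_t^{V} - s_t^{A})
          VAE_t = r_t^{A} - s_t^{A}   (residual bias on a subsequent auditory-only trial, examined as a function of the preceding discrepancy \Delta_{t-1})

          where r denotes the reported sound location and s^{A}, s^{V} the true auditory and visual positions.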

          eLife digest

          A good ventriloquist will make their audience experience an illusion. The speech the spectators hear appears to come from the mouth of the puppet and not from the puppeteer. Moviegoers experience the same illusion: they perceive dialogue as coming from the mouths of the actors on screen, rather than from the loudspeakers mounted on the walls. Known as the ventriloquist effect, this ‘trick’ exists because the brain assumes that sights and sounds which occur at the same time have the same origin, and it therefore combines the two sets of sensory stimuli.

          A version of the ventriloquist effect can be induced in the laboratory. Participants hear a sound while watching a simple visual stimulus (for instance, a circle) appear on a screen. When asked to pinpoint the origin of the noise, volunteers choose a location shifted towards the circle, even if that was not where the sound came from. In addition, this error persists when the visual stimulus is no longer present: if a standard trial is followed by a trial that features a sound but no circle, participants perceive the sound on this second trial as ‘drawn’ in the direction of the earlier shift. This is known as the ventriloquist aftereffect.

          By scanning the brains of healthy volunteers performing this task, Park and Kayser show that a number of brain areas contribute to the ventriloquist effect. All of these regions help to combine what we see with what we hear, but only one maintains representations of the combined sensory inputs over time. Called the medial superior parietal cortex, this area is unique in contributing to both the ventriloquist effect and its aftereffect.

          We must constantly use past and current sensory information to adapt our behavior to the environment. The results by Park and Kayser shed light on the brain structures that underpin our capacity to combine information from several senses, as well as our ability to encode memories. Such knowledge should be useful to explore how we can make flexible decisions.

          Most cited references (74)


          The ventriloquist effect results from near-optimal bimodal integration.

          Ventriloquism is the ancient art of making one's voice appear to come from elsewhere, an art exploited by the Greek and Roman oracles, and possibly earlier. We regularly experience the effect when watching television and movies, where the voices seem to emanate from the actors' lips rather than from the actual sound source. Originally, ventriloquism was explained by performers projecting sound to their puppets by special techniques, but more recently it is assumed that ventriloquism results from vision "capturing" sound. In this study we investigate spatial localization of audio-visual stimuli. When visual localization is good, vision does indeed dominate and capture sound. However, for severely blurred visual stimuli (that are poorly localized), the reverse holds: sound captures vision. For less blurred stimuli, neither sense dominates and perception follows the mean position. Precision of bimodal localization is usually better than either the visual or the auditory unimodal presentation. All the results are well explained not by one sense capturing the other, but by a simple model of optimal combination of visual and auditory information.
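
          The "simple model of optimal combination" referred to above is the standard reliability-weighted (maximum-likelihood) cue-combination rule, sketched here in generic notation rather than the authors' own:

          \hat{S}_{AV} = w_V S_V + (1 - w_V) S_A,   where   w_V = (1/\sigma_V^2) / (1/\sigma_V^2 + 1/\sigma_A^2)
          \sigma_{AV}^2 = \sigma_V^2 \sigma_A^2 / (\sigma_V^2 + \sigma_A^2) \le \min(\sigma_V^2, \sigma_A^2)

          Under this rule vision dominates when it is sharply localized (small \sigma_V), sound dominates when the visual stimulus is heavily blurred, and the bimodal estimate is never less precise than the better unimodal estimate, matching the pattern of results described above.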

            Choice-specific sequences in parietal cortex during a virtual-navigation decision task

            The posterior parietal cortex (PPC) plays an important role in many cognitive behaviors; however, the neural circuit dynamics underlying PPC function are not well understood. Here we optically imaged the spatial and temporal activity patterns of neuronal populations in mice performing a PPC-dependent task that combined a perceptual decision and memory-guided navigation in a virtual environment. Individual neurons had transient activation staggered relative to one another in time, forming a sequence of neuronal activation spanning the entire length of a task trial. Distinct sequences of neurons were triggered on trials with opposite behavioral choices and defined divergent, choice-specific trajectories through a state space of neuronal population activity. Cells participating in the different sequences and at distinct time points in the task were anatomically intermixed over microcircuit length scales (< 100 micrometers). During working memory decision tasks the PPC may therefore perform computations through sequence-based circuit dynamics, rather than long-lived stable states, implemented using anatomically intermingled microcircuits.

              A category-free neural population supports evolving demands during decision-making

              The posterior parietal cortex (PPC) receives diverse inputs and is involved in a dizzying array of behaviors. These multiple behaviors could rely on distinct categories of neurons specialized to represent particular variables or could rely on a single population of PPC neurons that is leveraged in different ways. To distinguish these possibilities, we evaluated rat PPC neurons recorded during multisensory decisions. Novel tests revealed that task parameters and temporal response features were distributed randomly across neurons, without evidence of categories. This suggests that PPC neurons constitute a dynamic network that is decoded according to the animal’s current needs. To test for an additional signature of a dynamic network, we compared moments when behavioral demands differ: decision and movement. Our novel state-space analysis revealed that the network explored different dimensions during decision and movement. These observations suggest that a single network of neurons can support the evolving behavioral demands of decision-making.

                Author and article information

                Journal
                eLife (eLife Sciences Publications, Ltd)
                ISSN: 2050-084X
                Published: 27 June 2019
                Volume 8: e47001
                Affiliations
                [1] Department for Cognitive Neuroscience, Faculty of Biology, Bielefeld University, Bielefeld, Germany
                [2] Center of Excellence Cognitive Interaction Technology, Bielefeld University, Bielefeld, Germany
                [3] Institute of Neuroscience and Psychology, University of Glasgow, Glasgow, United Kingdom
                Author information
                https://orcid.org/0000-0002-2191-2055
                https://orcid.org/0000-0001-7362-5704
                Article
                DOI: 10.7554/eLife.47001
                PMCID: PMC6660215
                PMID: 31246172
                © 2019, Park and Kayser

                This article is distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use and redistribution provided that the original author and source are credited.

                History
                Received: 19 March 2019
                Accepted: 26 June 2019
                Funding
                Funded by: H2020 European Research Council (FundRef: http://dx.doi.org/10.13039/100010663)
                Award ID: ERC-2014-CoG No 646657
                The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.
                Categories
                Research Article
                Neuroscience
                Custom metadata
                When facing discrepancies in the sensory environment, the brain combines multisensory information in the medial superior parietal cortex to guide immediate judgements and also to adjust subsequent unisensory perception.

                Life sciences
                Keywords: multisensory integration, sensory recalibration, ventriloquist effect, ventriloquist after-effect, precuneus, sound localization, human
