
      Upright Perception and Ocular Torsion Change Independently during Head Tilt

Research Article


          Abstract

We maintain a stable perception of the visual world despite continuous movements of our eyes, head, and body. Perception of upright is a key aspect of such orientation constancy. Here we investigated whether changes in upright perception during sustained head tilt were related to simultaneous changes in the torsional position of the eyes. We used a subjective visual vertical (SVV) task, modified to track changes in upright perception over time, and a custom video method to measure ocular torsion simultaneously. We tested 12 subjects in the upright position, during prolonged (~15 min) lateral head tilts of 20 degrees, and again after the head returned to the upright position. While the head was tilted, the SVV drifted in the same direction as the head tilt (left tilt: −5.4 ± 1.4° and right tilt: +2.2 ± 2.1°). After the head returned to the upright position, there was an SVV aftereffect with respect to the pre-tilt baseline, which was also in the same direction as the head tilt (left tilt: −3.9 ± 0.6° and right tilt: +2.55 ± 1.0°). Neither the SVV drift nor the SVV aftereffect was correlated with the changes in ocular torsion. Using a Bayesian spatial-perception model, we show that the pattern of SVV drift and aftereffect in our results could be explained by a drift and an adaptation in the sensory inputs that encode head orientation. The fact that ocular torsion (mainly driven by the otoliths) could not account for the perceptual changes suggests that neck proprioception could be the primary source of the drift in upright perception during head tilt, and subsequently of the aftereffect after returning to the upright position.
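To make the Bayesian account concrete, here is a minimal sketch (in Python) of how a precision-weighted combination of head-tilt cues could produce an SVV drift of the kind described. This is not the authors' model: the cue names, noise levels, and drift values below are illustrative assumptions, chosen only to show that a slow drift in one input (e.g., neck proprioception) shifts the fused estimate of head orientation, and hence the perceived vertical.

    import numpy as np

    def head_tilt_estimate(otolith_deg, proprio_deg,
                           sigma_oto=4.0, sigma_prop=6.0, sigma_prior=10.0):
        # Precision-weighted (Bayesian) fusion of two head-tilt cues with a
        # prior centered on upright (0 deg); all sigma values are assumed.
        w = np.array([1 / sigma_oto**2, 1 / sigma_prop**2, 1 / sigma_prior**2])
        means = np.array([otolith_deg, proprio_deg, 0.0])
        return np.sum(w * means) / np.sum(w)

    head_tilt = 20.0  # degrees of lateral head tilt, as in the experiment
    for minutes, drift in [(0, 0.0), (7, -2.5), (15, -5.0)]:  # hypothetical proprioceptive drift
        est = head_tilt_estimate(head_tilt, head_tilt + drift)
        print(f"{minutes:>2} min: fused head-tilt estimate ~ {est:.1f} deg")

By the same arithmetic, a shift in the proprioceptive input that persists after the head returns to upright would produce an aftereffect relative to the pre-tilt baseline.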


Most cited references (23)


          A Bayesian model of the disambiguation of gravitoinertial force by visual cues.

          The otoliths are stimulated in the same fashion by gravitational and inertial forces, so otolith signals are ambiguous indicators of self-orientation. The ambiguity can be resolved with added visual information indicating orientation and acceleration with respect to the earth. Here we present a Bayesian model of the statistically optimal combination of noisy vestibular and visual signals. Likelihoods associated with sensory measurements are represented in an orientation/acceleration space. The likelihood function associated with the otolith signal illustrates the ambiguity; there is no unique solution for self-orientation or acceleration. Likelihood functions associated with other sensory signals can resolve this ambiguity. In addition, we propose two priors, each acting on a dimension in the orientation/acceleration space: the idiotropic prior and the no-acceleration prior. We conducted experiments using a motion platform and attached visual display to examine the influence of visual signals on the interpretation of the otolith signal. Subjects made pitch and acceleration judgments as the vestibular and visual signals were manipulated independently. Predictions of the model were confirmed: (1) visual signals affected the interpretation of the otolith signal, (2) less variable signals had more influence on perceived orientation and acceleration than more variable ones, and (3) combined estimates were more precise than single-cue estimates. We also show that the model can explain some well-known phenomena including the perception of upright in zero gravity, the Aubert effect, and the somatogravic illusion.
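Predictions (2) and (3) above follow from the standard rule for fusing Gaussian cues, which can be shown with a toy Python example. This is only a one-dimensional sketch with assumed numbers, not the paper's full orientation/acceleration likelihood space:

    import numpy as np

    def fuse(mu_a, sigma_a, mu_b, sigma_b):
        # Minimum-variance combination of two Gaussian cues: each cue is
        # weighted in proportion to its precision (1 / variance).
        var_a, var_b = sigma_a**2, sigma_b**2
        mu = (mu_a / var_a + mu_b / var_b) / (1 / var_a + 1 / var_b)
        sigma = np.sqrt(1.0 / (1 / var_a + 1 / var_b))
        return mu, sigma

    # A reliable visual cue (10 deg, sigma = 2) against a noisy otolith cue (0 deg, sigma = 6):
    mu, sigma = fuse(10.0, 2.0, 0.0, 6.0)
    print(mu, sigma)  # ~9.0 and ~1.9: the estimate is pulled toward the
                      # reliable cue and is more precise than either cue alone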

            Multisensory processing in spatial orientation: an inverse probabilistic approach.

Most evidence that the brain uses Bayesian inference to integrate noisy sensory signals optimally has been obtained by showing that the noise levels in each modality separately can predict performance in combined conditions. Such a forward approach is difficult to implement when the various signals cannot be measured in isolation, as in spatial orientation, which involves the processing of visual, somatosensory, and vestibular cues. Instead, we applied an inverse probabilistic approach, based on optimal observer theory. Our goal was to investigate whether the perceptual differences found when probing two different states (body-in-space and head-in-space orientation) can be reconciled by a shared scheme using all available sensory signals. Using a psychometric approach, seven human subjects were tested on two orientation estimates at tilts < 120°: perception of body tilt [subjective body tilt (SBT)] and perception of visual vertical [subjective visual vertical (SVV)]. In all subjects, the SBT was more accurate than the SVV, which showed substantial systematic errors for tilt angles beyond 60°. Variability increased with tilt angle in both tasks, but was consistently lower in the SVV. The sensory integration model fitted both datasets very nicely. A further experiment, in which supine subjects judged their head orientation relative to the body, independently confirmed the predicted head-on-body noise by the model. Model predictions based on the derived noise properties from the various modalities were also consistent with previously published deficits in vestibular and somatosensory patients. We conclude that Bayesian computations can account for the typical differences in spatial orientation judgments associated with different task requirements.
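For readers unfamiliar with the psychometric approach mentioned above, the sketch below simulates the idea: binary judgments at a range of probe orientations are fitted with a cumulative Gaussian whose mean gives the systematic error (accuracy) and whose standard deviation gives the variability. The data, parameter values, and least-squares fit are illustrative assumptions, not the authors' procedure:

    import numpy as np
    from scipy.optimize import curve_fit
    from scipy.stats import norm

    rng = np.random.default_rng(0)
    probe = np.repeat(np.linspace(-10, 10, 11), 20)   # probe orientations (deg)
    true_mu, true_sigma = 2.0, 3.0                    # assumed observer parameters
    responses = rng.random(probe.size) < norm.cdf(probe, true_mu, true_sigma)

    def psychometric(x, mu, sigma):
        # Probability of a "tilted clockwise" response at probe orientation x
        return norm.cdf(x, mu, sigma)

    (mu_hat, sigma_hat), _ = curve_fit(psychometric, probe,
                                       responses.astype(float), p0=[0.0, 1.0])
    print(mu_hat, sigma_hat)  # recovered accuracy (systematic error) and variability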

              Properties of the internal representation of gravity inferred from spatial-direction and body-tilt estimates.

              One of the key questions in spatial perception is whether the brain has a common representation of gravity that is generally accessible for various perceptual orientation tasks. To evaluate this idea, we compared the ability of six tilted subjects to indicate earth-centric directions in the dark with a visual and an oculomotor paradigm and to estimate their body tilt relative to gravity. Subjective earth-horizontal and -vertical data were collected, either by adjusting a visual line or by making saccades, at 37 roll-tilt angles across the entire range. These spatial perception responses and the associated body-tilt estimates were subjected to a principal-component analysis to describe their tilt dependence. This analysis allowed us to separate systematic and random errors in performance, to disentangle the effects of task (horizontal vs. vertical) and paradigm (visual vs. oculomotor) in the space-perception data, and to compare the veridicality of space perception and the sense of self-tilt. In all spatial-orientation tests, whether involving space-perception or body-tilt judgments, subjects made considerable systematic errors which mostly betrayed tilt underestimation [Aubert effect (A effect)] and peaked near 130 degrees tilt. However, the A effect was much smaller in body-tilt estimates than in spatial pointing, implying that the underlying signal processing must have been different. Pointing results obtained with the visual and the oculomotor paradigm were not identical either, but these differences, which were task-related (horizontal vs. vertical), were subtle in comparison. The tilt-dependent pattern of random errors (noisy scatter) was almost identical in visual and oculomotor pointing results, showing a steep monotonic increase with tilt angle, but was again clearly different in the body-tilt estimates. These findings are discussed in the context of a conceptual model in an attempt to explain how the different patterns of systematic and random errors in external-space and self-tilt perception may come about. The scheme proposes that basically similar computational mechanisms, working with different settings, may be responsible.
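As a rough illustration of how a principal-component analysis can separate systematic from random errors, the simulated example below builds a subjects-by-tilt-angle matrix from a shared, tilt-dependent error plus independent scatter, and recovers the shared component as the first principal component. The data layout and numbers are assumptions; this is not the authors' analysis code:

    import numpy as np

    rng = np.random.default_rng(1)
    tilts = np.linspace(0, 180, 37)                            # 37 roll-tilt angles (deg)
    a_effect = -15.0 * np.sin(np.deg2rad(tilts * 90 / 130))    # toy systematic error, peak near 130 deg
    errors = a_effect + rng.normal(0.0, 3.0, (6, tilts.size))  # 6 subjects: shared error + scatter

    u, s, vt = np.linalg.svd(errors, full_matrices=False)
    systematic = s[0] * np.outer(u[:, 0], vt[0])   # rank-1, shared tilt-dependent component
    residual = errors - systematic                 # the remaining random scatter
    print(np.round(s**2 / np.sum(s**2), 3))        # variance captured by each component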

Author and article information

Journal: Frontiers in Human Neuroscience (Front. Hum. Neurosci.)
Publisher: Frontiers Media S.A.
ISSN: 1662-5161
Published: 17 November 2016
Volume: 10, Article: 573

Affiliations
1. Department of Neurology, The Johns Hopkins University School of Medicine, Baltimore, MD, USA
2. Department of Otolaryngology-Head and Neck Surgery, The Johns Hopkins University School of Medicine, Baltimore, MD, USA
                Author notes

                Edited by: Rachael D. Seidler, University of Michigan, USA

                Reviewed by: Tim Kiemel, University of Maryland College Park, USA; Torin K. Clark, University of Colorado Boulder, USA

*Correspondence: Amir Kheradmand akherad@jhu.edu
Article identifiers
DOI: 10.3389/fnhum.2016.00573
PMC: 5112230
PMID: 27909402
                Copyright © 2016 Otero-Millan and Kheradmand.

                This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution and reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

History
Received: 21 July 2016
Accepted: 28 October 2016
                Page count
                Figures: 5, Tables: 0, Equations: 18, References: 26, Pages: 9, Words: 6877
Funding
Funded by: National Institutes of Health (10.13039/100000002); Fight for Sight; Leon Levy
Categories
Neuroscience; Original Research; Neurosciences

Keywords
subjective visual vertical, SVV, upright perception, torsional eye position, ocular torsion, aftereffect, head tilt
