      Crossmodal correspondence of elevation/pitch and size/pitch is driven by real-world features

Research article


          Abstract

          Crossmodal correspondences are consistent associations between sensory features from different modalities; competing theories hold that they either reflect statistical correlations in the environment or stem from innate neural structures. This study investigates that question by examining whether retinotopic or representational features of stimuli induce crossmodal congruency effects. Participants completed an auditory pitch discrimination task paired with visual stimuli that varied in their sensory (retinotopic) or representational (scene-integrated) nature, for both the elevation/pitch and size/pitch correspondences. Only representational visual stimuli produced crossmodal congruency effects on pitch discrimination. These results support an environmental-statistics account, suggesting that crossmodal correspondences rely on real-world features rather than on sensory representations.
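
          The congruency effect described above is conventionally quantified as the performance difference between congruent and incongruent audio-visual pairings. As a minimal illustration (not the authors' analysis pipeline), the sketch below computes a reaction-time congruency effect from invented data; all values and names are hypothetical:

```python
# Hypothetical sketch: a crossmodal congruency effect quantified as the
# mean reaction-time difference between incongruent and congruent trials.
# All data below are invented for illustration.

from statistics import mean

# Invented reaction times (seconds) for one participant.
congruent_rts = [0.52, 0.48, 0.55, 0.50, 0.47]    # e.g. high pitch with high/small stimulus
incongruent_rts = [0.61, 0.58, 0.63, 0.57, 0.60]  # e.g. high pitch with low/large stimulus

def congruency_effect(congruent, incongruent):
    """Positive values mean slower responses on incongruent trials."""
    return mean(incongruent) - mean(congruent)

effect = congruency_effect(congruent_rts, incongruent_rts)
print(f"Congruency effect: {effect * 1000:.1f} ms")
```

          A reliable positive effect for one stimulus type but not another (here, representational but not retinotopic stimuli) is the pattern of evidence the study reports.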


                Author and article information

                Contributors
                John.mcewan@uq.edu.au

                Journal
                Attention, Perception, & Psychophysics (Atten Percept Psychophys)
                Springer US (New York)
                ISSN: 1943-3921, 1943-393X

                Published: 26 October 2024
                Volume 86, Issue 8, pp. 2821-2833

                Affiliations
                School of Psychology, The University of Queensland (https://ror.org/00rqy9422), St. Lucia, QLD 4072, Australia

                Author information
                ORCID: http://orcid.org/0000-0002-4869-3905

                Article
                2975
                DOI: 10.3758/s13414-024-02975-7
                PMC: 11652408
                PMID: 39461934
                © The Author(s) 2024

                Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.

                History: 9 October 2024
                Funding: The University of Queensland
                Categories: Article
                Custom metadata: © The Psychonomic Society, Inc. 2024

                Clinical Psychology & Psychiatry
                Keywords: multisensory integration, neural mechanisms, scene perception
