      Distinguishing between straight and curved sounds: Auditory shape in pitch, loudness, and tempo gestures

      research-article
      Attention, Perception & Psychophysics
      Springer US
      Sound gesture, Crossmodal correspondence, Audiovisual, Pitch, Loudness, Tempo


          Abstract

Sound-based trajectories or sound gestures draw links to spatiokinetic processes. For instance, a gliding, decreasing pitch conveys an analogous downward motion or fall. Whereas the gesture’s pitch orientation and range convey its meaning and magnitude, respectively, the way in which pitch changes over time can be conceived of as gesture shape, which to date has rarely been studied in isolation. This article reports on an experiment that studied the perception of shape in uni-directional pitch, loudness, and tempo gestures, each assessed for four physical scalings. Gestures could increase or decrease over time and comprised different frequency and sound level ranges, durations, and scaling contexts. Using a crossmodal-matching task, participants could reliably distinguish between pitch and loudness gestures and relate them to analogous visual line segments. Scalings based on equivalent-rectangular bandwidth (ERB) rate for pitch and raw signal amplitude for loudness were matched closest to a straight line, whereas other scalings led to perceptions of exponential or logarithmic curvatures. The investigated tempo gestures, by contrast, did not yield reliable differences. The reliable, robust perception of gesture shape for pitch and loudness has implications for various sound-design applications, especially those cases that rely on crossmodal mappings, e.g., visual analysis or control interfaces like audio waveforms or spectrograms. Given its perceptual relevance, auditory shape appears to be an integral part of sound gestures, while illustrating how crossmodal correspondences can underpin auditory perception.
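The physical scalings contrasted in the experiment lend themselves to a brief illustration. The sketch below is not the authors' stimulus code; it assumes the widely cited Glasberg and Moore (1990) ERB-rate conversion and uses illustrative frequency, amplitude, and duration values to show how uni-directional pitch and loudness gestures could be generated under a few such scalings.

```python
import numpy as np

def erb_rate(f_hz):
    """ERB-rate (ERB number) for a frequency in Hz, using the widely cited
    Glasberg & Moore (1990) approximation (an assumption here, not the
    article's exact implementation)."""
    return 21.4 * np.log10(4.37e-3 * f_hz + 1.0)

def erb_rate_to_hz(e):
    """Inverse of erb_rate: frequency in Hz for a given ERB-rate value."""
    return (10.0 ** (e / 21.4) - 1.0) / 4.37e-3

def pitch_gesture(f_start, f_end, dur_s, sr=48000, scaling="erb"):
    """Instantaneous frequency of a uni-directional pitch glide.
    scaling: 'erb' -> linear in ERB-rate,
             'hz'  -> linear in raw frequency,
             'log' -> linear in log-frequency (semitone-like)."""
    t = np.linspace(0.0, 1.0, int(dur_s * sr), endpoint=False)
    if scaling == "erb":
        e = erb_rate(f_start) + t * (erb_rate(f_end) - erb_rate(f_start))
        return erb_rate_to_hz(e)
    if scaling == "hz":
        return f_start + t * (f_end - f_start)
    if scaling == "log":
        return f_start * (f_end / f_start) ** t
    raise ValueError(f"unknown scaling: {scaling}")

def loudness_gesture(a_start, a_end, dur_s, sr=48000, scaling="amplitude"):
    """Amplitude envelope of a loudness gesture: linear in raw signal
    amplitude or linear in dB."""
    t = np.linspace(0.0, 1.0, int(dur_s * sr), endpoint=False)
    if scaling == "amplitude":
        return a_start + t * (a_end - a_start)
    if scaling == "db":
        db = 20 * np.log10(a_start) + t * 20 * np.log10(a_end / a_start)
        return 10.0 ** (db / 20.0)
    raise ValueError(f"unknown scaling: {scaling}")

# Example: a 2-s rising pitch gesture from 220 Hz to 880 Hz, linear in
# ERB-rate, rendered as a sine sweep with a rising amplitude envelope.
sr = 48000
f = pitch_gesture(220.0, 880.0, 2.0, sr=sr, scaling="erb")
phase = 2 * np.pi * np.cumsum(f) / sr
signal = loudness_gesture(0.1, 0.5, 2.0, sr=sr) * np.sin(phase)
```

Read against the abstract, a gesture that is linear in ERB-rate (pitch) or in raw amplitude (loudness) corresponds to the scalings reported as matched closest to a straight line, whereas gestures linear in raw frequency or in dB would be perceived as curved.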

Most cited references (32)

          Derivation of auditory filter shapes from notched-noise data.

A well-established method for estimating the shape of the auditory filter is based on the measurement of the threshold of a sinusoidal signal in a notched-noise masker, as a function of notch width. To measure the asymmetry of the filter, the notch has to be placed both symmetrically and asymmetrically about the signal frequency. In previous work, several simplifying assumptions and approximations were made in deriving auditory filter shapes from the data. In this paper we describe modifications to the fitting procedure which allow more accurate derivations. These include: 1) taking into account changes in filter bandwidth with centre frequency when allowing for the effects of off-frequency listening; 2) correcting for the non-flat frequency response of the earphone; 3) correcting for the transmission characteristics of the outer and middle ear; 4) limiting the amount by which the centre frequency of the filter can shift in order to maximise the signal-to-masker ratio. In many cases, these modifications result in only small changes to the derived filter shape. However, at very high and very low centre frequencies and for hearing-impaired subjects the differences can be substantial. It is also shown that filter shapes derived from data where the notch is always placed symmetrically about the signal frequency can be seriously in error when the underlying filter is markedly asymmetric. New formulae are suggested describing the variation of the auditory filter with frequency and level. The implications of the results for the calculation of excitation patterns are discussed, and a modified procedure is proposed. The appendix lists FORTRAN computer programs for deriving auditory filter shapes from notched-noise data and for calculating excitation patterns. The first program can readily be modified so as to derive auditory filter shapes from data obtained with other types of maskers, such as rippled noise.
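The core of the fitting procedure, before the refinements listed above, can be sketched as follows. This is a minimal illustration that assumes the standard symmetric roex(p) filter shape used in this line of work; the notch widths, thresholds, and starting values are hypothetical, and the corrections for off-frequency listening, earphone response, and outer/middle-ear transmission described in the abstract are omitted.

```python
import numpy as np
from scipy.optimize import curve_fit

def roex_weight_integral(g, p):
    """Integral of the roex(p) weighting W(g) = (1 + p*g) * exp(-p*g)
    from normalized frequency deviation g to infinity:
    (2 + p*g) * exp(-p*g) / p."""
    return (2.0 + p * g) * np.exp(-p * g) / p

def predicted_threshold(g_notch, p, k_db):
    """Predicted signal threshold (dB) for a symmetric notch of normalized
    half-width g_notch; noise spectrum level and detector efficiency are
    absorbed into the additive constant k_db."""
    noise_through_filter = 2.0 * roex_weight_integral(g_notch, p)
    return k_db + 10.0 * np.log10(noise_through_filter)

# Hypothetical data: notch half-widths (as a fraction of the center
# frequency) and measured signal thresholds in dB SPL.
g_data = np.array([0.0, 0.1, 0.2, 0.3, 0.4])
thr_data = np.array([55.0, 48.0, 41.0, 35.0, 30.0])

(p_fit, k_fit), _ = curve_fit(predicted_threshold, g_data, thr_data,
                              p0=(25.0, 60.0))
fc = 1000.0                    # assumed center frequency in Hz
erb_hz = 4.0 * fc / p_fit      # ERB of a symmetric roex(p) filter
print(f"fitted p = {p_fit:.1f}, ERB ~ {erb_hz:.0f} Hz")
```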

            Recommended effect size statistics for repeated measures designs.

Investigators, who are increasingly implored to present and discuss effect size statistics, might comply more often if they understood more clearly what is required. When investigators wish to report effect sizes derived from analyses of variance that include repeated measures, past advice has been problematic. Only recently has a generally useful effect size statistic been proposed for such designs: generalized eta squared (η²G; Olejnik & Algina, 2003). Here, we present this method, explain that η²G is preferred to eta squared and partial eta squared because it provides comparability across between-subjects and within-subjects designs, show that it can easily be computed from information provided by standard statistical packages, and recommend that investigators provide it routinely in their research reports when appropriate.
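For the simplest case covered by this framework, a design with a single manipulated within-subjects factor, the computation is short. The sketch below illustrates only that special case, in which η²G reduces to the effect sum of squares divided by the sum of effect, subject, and error sums of squares; it is not a general implementation of Olejnik and Algina's taxonomy, and the scores are made up.

```python
import numpy as np

def generalized_eta_squared_rm(data):
    """Generalized eta squared for a one-way repeated-measures design with a
    single manipulated within-subjects factor:
    eta^2_G = SS_effect / (SS_effect + SS_subjects + SS_error).
    `data` is an (n_subjects, n_conditions) array of scores."""
    data = np.asarray(data, dtype=float)
    grand_mean = data.mean()
    cond_means = data.mean(axis=0)     # means per condition
    subj_means = data.mean(axis=1)     # means per subject
    n_subj, n_cond = data.shape

    ss_effect = n_subj * np.sum((cond_means - grand_mean) ** 2)
    ss_subjects = n_cond * np.sum((subj_means - grand_mean) ** 2)
    ss_total = np.sum((data - grand_mean) ** 2)
    ss_error = ss_total - ss_effect - ss_subjects

    return ss_effect / (ss_effect + ss_subjects + ss_error)

# Example with made-up ratings from 4 subjects in 3 conditions.
scores = [[3.0, 4.0, 6.0],
          [2.0, 4.0, 5.0],
          [4.0, 5.0, 7.0],
          [3.0, 3.0, 6.0]]
print(f"eta^2_G = {generalized_eta_squared_rm(scores):.3f}")
```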

              Suggested formulae for calculating auditory-filter bandwidths and excitation patterns.

              Recent estimates of auditory-filter shape are used to derive a simple formula relating the equivalent rectangular bandwidth (ERB) of the auditory filter to center frequency. The value of the auditory-filter bandwidth continues to decrease as center frequency decreases below 500 Hz. A formula is also given relating ERB-rate to frequency. Finally, a method is described for calculating excitation patterns from filter shapes.
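A minimal sketch of how an excitation pattern can be computed from filter shapes is given below. It is not the paper's own formulation: the ERB coefficients used here are the later, widely cited Glasberg and Moore (1990) revision rather than the 1983 polynomial, the filter is the standard symmetric roex(p) shape, and the level dependence of the filter is ignored.

```python
import numpy as np

def erb_hz(fc_hz):
    """ERB as a function of center frequency, using the widely cited
    Glasberg & Moore (1990) coefficients (an assumption for illustration;
    the 1983 paper gives its own polynomial formula)."""
    return 24.7 * (4.37e-3 * fc_hz + 1.0)

def excitation_pattern(f_tone, level_db, fc_grid):
    """Excitation pattern of a pure tone: for each center frequency in
    fc_grid, evaluate a symmetric roex(p) filter (p = 4*fc/ERB(fc)) at the
    tone frequency and scale by the tone level."""
    fc_grid = np.asarray(fc_grid, dtype=float)
    p = 4.0 * fc_grid / erb_hz(fc_grid)
    g = np.abs(f_tone - fc_grid) / fc_grid      # normalized deviation
    w = (1.0 + p * g) * np.exp(-p * g)          # roex(p) weighting
    return level_db + 10.0 * np.log10(w)        # excitation level in dB

# Example: excitation pattern of a 1-kHz tone at 60 dB SPL across a grid
# of filter center frequencies.
fc = np.linspace(200.0, 5000.0, 200)
pattern = excitation_pattern(1000.0, 60.0, fc)
```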

                Author and article information

                Contributors
                sven-amin.lembke@aru.ac.uk
Journal
Atten Percept Psychophys (Attention, Perception & Psychophysics)
Springer US (New York)
ISSN 1943-3921 (print); 1943-393X (electronic)
Published: 18 September 2023
Volume 85, Issue 8, pp. 2751-2773 (2023)
                Affiliations
Cambridge School of Creative Industries, Anglia Ruskin University (https://ror.org/0009t4v78), Cambridge, UK
                Author information
ORCID: http://orcid.org/0000-0003-4785-5826
Article
Publisher ID: 2764
DOI: 10.3758/s13414-023-02764-8
PMCID: 10600048
PMID: 37721687
                © The Author(s) 2023

                Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.

History: 4 July 2023
Categories: Article
© The Psychonomic Society, Inc. 2023

Subject: Clinical Psychology & Psychiatry
Keywords: sound gesture, crossmodal correspondence, audiovisual, pitch, loudness, tempo
