
      Perceived rhythmic regularity is greater for song than speech: examining acoustic correlates of rhythmic regularity in speech and song

Research article


          Abstract

Rhythm is a key feature of music and language, but the way rhythm unfolds within each domain differs. Music induces perception of a beat, a regular repeating pulse spaced by roughly equal durations, whereas speech does not have the same isochronous framework. Although rhythmic regularity is a defining feature of music and language, it is difficult to derive acoustic indices of the differences in rhythmic regularity between domains. The current study examined whether participants could provide subjective ratings of rhythmic regularity for acoustically matched (syllable-, tempo-, and contour-matched) and acoustically unmatched (varying in tempo, syllable number, semantics, and contour) exemplars of speech and song. We used subjective ratings to index the presence or absence of an underlying beat and correlated the ratings with stimulus features to identify acoustic metrics of regularity. Experiment 1 showed that ratings based on the term “rhythmic regularity” did not reflect a consistent definition of regularity across participants: those who adopted a beat-based definition rated song higher than speech, those who adopted a normal-prosody definition rated speech higher than song, and those with an unclear definition showed no difference. Experiment 2 therefore defined rhythmic regularity as how easy it would be to tap or clap to the utterances. Participants rated song as easier to clap or tap to than speech for both the acoustically matched and unmatched datasets. The regularity ratings from Experiment 2 showed that stimuli with longer syllable durations and less spectral flux were rated as more rhythmically regular across domains. Our findings demonstrate that rhythmic regularity distinguishes speech from song and that several key acoustic features can predict listeners’ perception of rhythmic regularity both within and across domains.
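          The two acoustic predictors the abstract highlights, syllable duration and spectral flux, can both be estimated directly from a waveform. The sketch below is not the authors' pipeline; it is a minimal illustration using the librosa library, where the file name, frame sizes, and the use of inter-onset intervals as a stand-in for syllable durations are all assumptions made for the example.

```python
# Minimal sketch (not the authors' pipeline): estimating the two features
# the abstract links to perceived regularity. File name, frame sizes, and
# the onset-interval proxy for syllable duration are illustrative assumptions.
import numpy as np
import librosa

y, sr = librosa.load("utterance.wav", sr=22050)  # hypothetical stimulus file
hop = 256

# Spectral flux: frame-to-frame change in the magnitude spectrum.
# Lower average flux corresponded to higher regularity ratings.
S = np.abs(librosa.stft(y, n_fft=1024, hop_length=hop))
flux = np.sqrt(np.sum(np.diff(S, axis=1) ** 2, axis=0))

# Rough syllable-duration proxy: intervals between detected onsets.
# Longer syllable durations corresponded to higher regularity ratings.
env = librosa.onset.onset_strength(y=y, sr=sr, hop_length=hop)
frames = librosa.onset.onset_detect(onset_envelope=env, sr=sr, hop_length=hop)
times = librosa.frames_to_time(frames, sr=sr, hop_length=hop)
intervals = np.diff(times)

print(f"mean spectral flux: {flux.mean():.2f}")
print(f"mean inter-onset interval: {intervals.mean():.3f} s")
```

          Per-stimulus features like these could then be correlated with listeners' ratings, which is the analysis strategy the abstract describes.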

          Most cited references (95)


          Confidence Intervals from Normalized Data: A correction to Cousineau (2005)


            Speech recognition with primarily temporal cues.

            Nearly perfect speech recognition was observed under conditions of greatly reduced spectral information. Temporal envelopes of speech were extracted from broad frequency bands and were used to modulate noises of the same bandwidths. This manipulation preserved temporal envelope cues in each band but restricted the listener to severely degraded information on the distribution of spectral energy. The identification of consonants, vowels, and words in simple sentences improved markedly as the number of bands increased; high speech recognition performance was obtained with only three bands of modulated noise. Thus, the presentation of a dynamic temporal pattern in only a few broad spectral regions is sufficient for the recognition of speech.
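            The manipulation this abstract describes, extracting temporal envelopes from frequency bands of speech and using them to modulate same-band noise, is commonly implemented as a noise vocoder. Below is a hedged sketch under assumed parameters; the band edges, filter order, and 16 Hz envelope cutoff are illustrative choices, not the original study's values.

```python
# Sketch of a noise vocoder as described in the abstract: band envelopes
# of speech modulate same-band noise. Band edges, filter order, and the
# 16 Hz envelope cutoff are illustrative assumptions, not the study's values.
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

def noise_vocode(speech, fs, band_edges=(100, 800, 1500, 4000)):
    rng = np.random.default_rng(0)
    out = np.zeros_like(speech, dtype=float)
    lp = butter(4, 16, btype="lowpass", fs=fs, output="sos")  # envelope smoother
    for lo, hi in zip(band_edges[:-1], band_edges[1:]):  # three bands, as in the abstract
        bp = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
        band = sosfiltfilt(bp, speech)
        env = np.maximum(sosfiltfilt(lp, np.abs(hilbert(band))), 0.0)  # temporal envelope
        noise = sosfiltfilt(bp, rng.standard_normal(len(speech)))      # same-band noise
        out += env * noise  # envelope-modulated noise carrier
    return out / np.max(np.abs(out))  # normalize to avoid clipping
```

            The result preserves each band's slow temporal envelope while replacing its fine spectral content with noise, which is the degradation the reported recognition scores were measured under.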

              Neural Mechanisms of Sustained Attention Are Rhythmic

              Classic models of attention suggest that sustained neural firing constitutes a neural correlate of sustained attention. However, recent evidence indicates that behavioral performance fluctuates over time, exhibiting temporal dynamics that closely resemble the spectral features of ongoing, oscillatory brain activity. Therefore, it has been proposed that periodic neuronal excitability fluctuations might shape attentional allocation and overt behavior. However, empirical evidence to support this notion is sparse. Here, we address this issue by examining data from large-scale subdural recordings, using two different attention tasks that track perceptual ability at high temporal resolution. Our results reveal that perceptual outcome varies as a function of the theta phase even in states of sustained spatial attention. These effects were robust at the single-subject level, suggesting that rhythmic perceptual sampling is an inherent property of the frontoparietal attention network. Collectively, these findings support the notion that the functional architecture of top-down attention is intrinsically rhythmic. Helfrich et al. demonstrate that the neural basis of sustained attention is rhythmic. Using human intracranial recordings, they show that attentional allocation and overt behavior are modulated by a ~4 Hz theta rhythm that predicts endogenous excitability fluctuations.
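              A common way to test the phase-dependence reported here is to band-pass the recording around theta, extract instantaneous phase with a Hilbert transform, and bin trial outcomes by the phase at stimulus onset. The sketch below illustrates that generic analysis; the signal names, the 3-5 Hz band, and the bin count are assumptions, not the study's code.

```python
# Generic phase-binning analysis (an assumption, not the study's code):
# does hit rate vary with ~4 Hz theta phase at stimulus onset?
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

def hit_rate_by_theta_phase(signal, fs, onset_samples, hits, n_bins=8):
    # Band-pass around theta (3-5 Hz) and get instantaneous phase.
    sos = butter(4, [3.0, 5.0], btype="bandpass", fs=fs, output="sos")
    phase = np.angle(hilbert(sosfiltfilt(sos, signal)))
    onset_phase = phase[onset_samples]  # theta phase at each stimulus onset
    edges = np.linspace(-np.pi, np.pi, n_bins + 1)
    idx = np.clip(np.digitize(onset_phase, edges) - 1, 0, n_bins - 1)
    # Mean hit rate per phase bin; systematic modulation across bins is
    # consistent with rhythmic perceptual sampling.
    return np.array([hits[idx == b].mean() if np.any(idx == b) else np.nan
                     for b in range(n_bins)])
```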

                Author and article information

                Journal
                Frontiers in Psychology (Front. Psychol.)
                Publisher: Frontiers Media S.A.
                ISSN: 1664-1078
                Published: 26 May 2023
                Volume: 14
                Article number: 1167003
                Affiliations
                1. The Brain and Mind Institute, Western University, London, ON, Canada
                2. Department of Psychology, Western University, London, ON, Canada
                3. Department of Psychology, University of Toronto Mississauga, Mississauga, ON, Canada
                Author notes

                Edited by: Dan Zhang, Tsinghua University, China

                Reviewed by: Yue Ding, Shanghai Mental Health Center, China; Juan Huang, Johns Hopkins University, United States

                *Correspondence: Christina Vanden Bosch der Nederlanden, c.dernederlanden@utoronto.ca
                Article
                DOI: 10.3389/fpsyg.2023.1167003
                PMCID: PMC10250601
                Copyright © 2023 Yu, Cabildo, Grahn and Vanden Bosch der Nederlanden.

                This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

                History
                Received: 15 February 2023
                Accepted: 09 May 2023
                Page count
                Figures: 3, Tables: 2, Equations: 0, References: 103, Pages: 12, Words: 10112
                Categories
                Psychology
                Original Research
                Custom metadata
                Auditory Cognitive Neuroscience

                Clinical Psychology & Psychiatry
                Keywords: rhythmic regularity, beat, speech, song, music information retrieval, periodicity, rhythm
