
      Neural synchronization is strongest to the spectral flux of slow music and depends on familiarity and beat salience

      research-article


          Abstract

Neural activity in the auditory system synchronizes to sound rhythms, and brain–environment synchronization is thought to be fundamental to successful auditory perception. Sound rhythms are often operationalized in terms of the sound’s amplitude envelope. We hypothesized that – especially for music – the envelope might not best capture the complex spectro-temporal fluctuations that give rise to beat perception and synchronized neural activity. This study investigated (1) neural synchronization to different musical features, (2) tempo-dependence of neural synchronization, and (3) dependence of synchronization on familiarity, enjoyment, and ease of beat perception. In this electroencephalography study, 37 human participants listened to tempo-modulated music (1–4 Hz). Independent of whether the analysis approach was based on temporal response functions (TRFs) or reliable components analysis (RCA), the spectral flux of music – as opposed to the amplitude envelope – evoked the strongest neural synchronization. Moreover, music with slower beat rates, high familiarity, and easy-to-perceive beats elicited the strongest neural response. Our results demonstrate the importance of spectro-temporal fluctuations in music for driving neural synchronization, and highlight its sensitivity to musical tempo, familiarity, and beat salience.
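At its core, the TRF approach mentioned in the abstract is a regularized linear mapping from time-lagged copies of a stimulus feature onto the recorded neural signal. The following is a minimal sketch of that idea on toy data – a generic ridge-regression TRF, not the authors' EEG pipeline; `max_lag` and `alpha` are illustrative choices:

```python
import numpy as np

def fit_trf(stimulus, response, max_lag=32, alpha=1.0):
    """Estimate a temporal response function: ridge regression from
    time-lagged copies of the stimulus feature onto the response."""
    n = len(stimulus)
    # Design matrix of lagged stimulus values (lags 0 .. max_lag-1 samples).
    X = np.zeros((n, max_lag))
    for lag in range(max_lag):
        X[lag:, lag] = stimulus[:n - lag]
    # Ridge solution: (X'X + alpha*I) w = X'y
    w = np.linalg.solve(X.T @ X + alpha * np.eye(max_lag), X.T @ response)
    return w  # one TRF weight per lag

# Toy check: the "neural" response is the stimulus delayed by 5 samples
# plus noise, so the recovered TRF should peak at lag 5.
rng = np.random.default_rng(0)
stim = rng.standard_normal(2000)
resp = np.roll(stim, 5) + 0.1 * rng.standard_normal(2000)
trf = fit_trf(stim, resp)
print(int(np.argmax(trf)))  # → 5
```

In practice, TRF toolboxes fit such models per EEG channel, include negative lags, and choose `alpha` by cross-validation; the sketch only shows the underlying regression.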

          eLife digest

          When we listen to a melody, the activity of our neurons synchronizes to the music: in fact, it is likely that the closer the match, the better we can perceive the piece. However, it remains unclear exactly which musical features our brain cells synchronize to. Previous studies, which have often used ‘simplified’ music, have highlighted that the amplitude envelope (how the intensity of the sounds changes over time) could be involved in this phenomenon, alongside factors such as musical training, attention, familiarity with the piece or even enjoyment. Whether differences in neural synchronization could explain why musical tastes vary between people is also still a matter of debate.

          In their study, Weineck et al. aim to better understand what drives neuronal synchronization to music. A technique known as electroencephalography was used to record brain activity in 37 volunteers listening to instrumental music whose tempo ranged from 60 to 240 beats per minute. The tunes varied across an array of features such as familiarity, enjoyment and how easy the beat was to perceive. Two different approaches were then used to calculate neural synchronization, which yielded converging results.

The analyses revealed that three types of factors were associated with a strong neural synchronization. First, amongst the various cadences, a tempo of 60–120 beats per minute elicited the strongest match with neuronal activity. Interestingly, this beat is commonly found in Western pop music, is usually preferred by listeners, and often matches spontaneous body rhythms such as walking pace. Second, synchronization was linked to variations in pitch and sound quality (known as ‘spectral flux’) rather than in the amplitude envelope. And finally, familiarity and perceived beat saliency – but not enjoyment or musical expertise – were connected to stronger synchronization.
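Spectral flux, as described above, measures how much the short-time magnitude spectrum changes from one analysis frame to the next – so it responds to pitch and timbre changes even when loudness stays constant. A minimal sketch (not the authors' feature-extraction pipeline; frame length, hop size, and the half-wave rectification are common but illustrative choices):

```python
import numpy as np

def spectral_flux(signal, frame_len=2048, hop=512):
    """Frame-to-frame change in the magnitude spectrum,
    half-wave rectified so only energy increases (onsets) count."""
    n_frames = 1 + (len(signal) - frame_len) // hop
    window = np.hanning(frame_len)
    spectra = np.array([
        np.abs(np.fft.rfft(window * signal[i * hop:i * hop + frame_len]))
        for i in range(n_frames)
    ])
    diff = np.diff(spectra, axis=0)           # change per FFT bin
    return np.maximum(diff, 0.0).sum(axis=1)  # keep increases only

# A tone whose pitch jumps halfway through: the amplitude envelope is
# flat, but spectral flux peaks around the note change at 0.5 s.
sr = 8000
t = np.arange(sr) / sr
tone = np.where(t < 0.5, np.sin(2 * np.pi * 220 * t),
                np.sin(2 * np.pi * 330 * t))
flux = spectral_flux(tone)
print(int(np.argmax(flux)))  # frame index of the largest spectral change
```

The toy example makes the paper's contrast concrete: an envelope-based feature is nearly silent about this note change, while spectral flux marks it clearly.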

          These findings help to better understand how our brains allow us to perceive and connect with music. The work conducted by Weineck et al. should help other researchers to investigate this field; in particular, it shows how important it is to consider spectral flux rather than amplitude envelope in experiments that use actual music.

          Related collections

Most cited references (68)


          The Psychophysics Toolbox


            FieldTrip: Open Source Software for Advanced Analysis of MEG, EEG, and Invasive Electrophysiological Data

            This paper describes FieldTrip, an open source software package that we developed for the analysis of MEG, EEG, and other electrophysiological data. The software is implemented as a MATLAB toolbox and includes a complete set of consistent and user-friendly high-level functions that allow experimental neuroscientists to analyze experimental data. It includes algorithms for simple and advanced analysis, such as time-frequency analysis using multitapers, source reconstruction using dipoles, distributed sources and beamformers, connectivity analysis, and nonparametric statistical permutation tests at the channel and source level. The implementation as toolbox allows the user to perform elaborate and structured analyses of large data sets using the MATLAB command line and batch scripting. Furthermore, users and developers can easily extend the functionality and implement new algorithms. The modular design facilitates the reuse in other software packages.

              The Musicality of Non-Musicians: An Index for Assessing Musical Sophistication in the General Population

              Musical skills and expertise vary greatly in Western societies. Individuals can differ in their repertoire of musical behaviours as well as in the level of skill they display for any single musical behaviour. The types of musical behaviours we refer to here are broad, ranging from performance on an instrument and listening expertise, to the ability to employ music in functional settings or to communicate about music. In this paper, we first describe the concept of ‘musical sophistication’ which can be used to describe the multi-faceted nature of musical expertise. Next, we develop a novel measurement instrument, the Goldsmiths Musical Sophistication Index (Gold-MSI) to assess self-reported musical skills and behaviours on multiple dimensions in the general population using a large Internet sample (n = 147,636). Thirdly, we report results from several lab studies, demonstrating that the Gold-MSI possesses good psychometric properties, and that self-reported musical sophistication is associated with performance on two listening tasks. Finally, we identify occupation, occupational status, age, gender, and wealth as the main socio-demographic factors associated with musical sophistication. Results are discussed in terms of theoretical accounts of implicit and statistical music learning and with regard to social conditions of sophisticated musical engagement.

                Author and article information

                Contributors
                Role: Reviewing Editor
                Role: Senior Editor
Journal
eLife (eLife Sciences Publications, Ltd; ISSN 2050-084X)
Published: 12 September 2022
eLife 2022;11:e75515
                Affiliations
                [1 ] Research Group “Neural and Environmental Rhythms”, Max Planck Institute for Empirical Aesthetics ( https://ror.org/000rdbk18) Frankfurt am Main Germany
                [2 ] Goethe University Frankfurt, Institute for Cell Biology and Neuroscience ( https://ror.org/04cvxnb49) Frankfurt am Main Germany
                [3 ] Department of Psychology, Toronto Metropolitan University ( https://ror.org/05g13zd79) Toronto Canada
                University of Birmingham ( https://ror.org/03angcq70) United Kingdom
                Carnegie Mellon University ( https://ror.org/05x2bcf33) United States
                University of Birmingham ( https://ror.org/03angcq70) United Kingdom
                University of Birmingham ( https://ror.org/03angcq70) United Kingdom
                CNRS ( https://ror.org/02feahw73) France
                Author information
                https://orcid.org/0000-0003-3204-860X
                https://orcid.org/0000-0001-8845-1233
                https://orcid.org/0000-0002-2284-8884
Article
DOI: 10.7554/eLife.75515
PMCID: PMC9467512
PMID: 36094165
© 2022, Weineck et al.

                This article is distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use and redistribution provided that the original author and source are credited.

History
Received: 12 November 2021
Accepted: 25 July 2022
                Funding
Funded by: European Research Council (FundRef http://dx.doi.org/10.13039/501100000781); Award ID: ERC-STG-804029 BRAINSYNC
Funded by: Max-Planck-Gesellschaft (FundRef http://dx.doi.org/10.13039/501100004189); Award ID: Max Planck Research Group Grant
                The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.
                Categories
                Research Article
                Neuroscience
                Custom metadata
                Two different analysis approaches for measuring neural synchronization to natural music revealed strongest synchronization to musical spectral flux as opposed to the more commonly used amplitude envelope.

                Life sciences
neural synchronization, music, temporal response function, tempo, reliable component analysis, spectral flux, human
