      Functional cortical localization of tongue movements using corticokinematic coherence with a deep learning-assisted motion capture system

      research-article


          Abstract

          Corticokinematic coherence (CKC) between magnetoencephalographic and movement signals using an accelerometer is useful for the functional localization of the primary sensorimotor cortex (SM1). However, it is difficult to determine the tongue CKC because an accelerometer yields excessive magnetic artifacts. Here, we introduce a novel approach for measuring the tongue CKC using a deep learning-assisted motion capture system with videography, and compare it with an accelerometer in a control task measuring finger movement. Twelve healthy volunteers performed rhythmical side-to-side tongue movements in the whole-head magnetoencephalographic system, which were simultaneously recorded using a video camera and examined using a deep learning-assisted motion capture system. In the control task, right finger CKC measurements were simultaneously evaluated via motion capture and an accelerometer. The right finger CKC with motion capture was significant at the movement frequency peaks or its harmonics over the contralateral hemisphere; the motion-captured CKC was 84.9% similar to that with the accelerometer. The tongue CKC was significant at the movement frequency peaks or its harmonics over both hemispheres. The CKC sources of the tongue were considerably lateral and inferior to those of the finger. Thus, the CKC with deep learning-assisted motion capture can evaluate the functional localization of the tongue SM1.
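For readers who want to see the shape of such an analysis, the sketch below computes magnitude-squared coherence between a cortical (MEG) channel and a kinematic trace using Welch's method via scipy.signal.coherence. The sampling rate, window length, movement frequency, and surrogate signals are illustrative assumptions, not the authors' recordings or processing pipeline.

```python
# A minimal sketch of a corticokinematic coherence (CKC) style computation:
# magnitude-squared coherence between one MEG channel and a kinematic trace
# (e.g., a tongue-keypoint trajectory from motion capture), estimated with
# Welch's method. The sampling rate, window length, and surrogate signals
# below are illustrative assumptions, not the authors' data or pipeline.
import numpy as np
from scipy import signal

fs = 1000.0                       # assumed common sampling rate (Hz)
t = np.arange(0, 60, 1 / fs)      # 60 s of simultaneous recordings
move_freq = 2.0                   # assumed ~2 Hz rhythmic movement

# Surrogate rhythmic drive with a harmonic, standing in for real recordings.
drive = np.sin(2 * np.pi * move_freq * t) + 0.5 * np.sin(2 * np.pi * 2 * move_freq * t)
kinematic = drive + 0.2 * np.random.randn(t.size)   # motion-capture signal
meg = 0.3 * drive + np.random.randn(t.size)         # one MEG channel

# Welch-based magnitude-squared coherence (cf. the Welch reference below).
freqs, coh = signal.coherence(meg, kinematic, fs=fs, nperseg=2048)

# CKC is typically assessed at the movement frequency and its harmonics.
for f0 in (move_freq, 2 * move_freq):
    idx = np.argmin(np.abs(freqs - f0))
    print(f"coherence at {freqs[idx]:.2f} Hz: {coh[idx]:.3f}")
```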


Most cited references (54)


The assessment and analysis of handedness: The Edinburgh inventory
R. C. Oldfield (1971)


DeepLabCut: markerless pose estimation of user-defined body parts with deep learning
A. Mathis et al. (2018)

            Quantifying behavior is crucial for many applications in neuroscience. Videography provides easy methods for the observation and recording of animal behavior in diverse settings, yet extracting particular aspects of a behavior for further analysis can be highly time consuming. In motor control studies, humans or other animals are often marked with reflective markers to assist with computer-based tracking, but markers are intrusive, and the number and location of the markers must be determined a priori. Here we present an efficient method for markerless pose estimation based on transfer learning with deep neural networks that achieves excellent results with minimal training data. We demonstrate the versatility of this framework by tracking various body parts in multiple species across a broad collection of behaviors. Remarkably, even when only a small number of frames are labeled (~200), the algorithm achieves excellent tracking performance on test frames that is comparable to human accuracy.
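As an orientation to how such markerless tracking is typically set up, the sketch below strings together the main steps of the DeepLabCut workflow: project creation, frame labeling, transfer-learning training, and video analysis. The project name, video path, and body-part choices are placeholders, and call signatures may differ across DeepLabCut versions; this is not the authors' configuration.

```python
# Minimal sketch of the DeepLabCut workflow for markerless tracking of a
# user-defined body part (e.g., the tongue tip) from video. Project name
# and video path are illustrative placeholders; consult the DeepLabCut
# documentation for the full, current API.
import deeplabcut

config_path = deeplabcut.create_new_project(
    "tongue-tracking", "experimenter", ["/data/videos/tongue.mp4"]
)

# Label a small number of frames (~200 can already give good performance,
# per the abstract above), then train and evaluate via transfer learning.
deeplabcut.extract_frames(config_path)
deeplabcut.label_frames(config_path)          # opens the labeling GUI
deeplabcut.create_training_dataset(config_path)
deeplabcut.train_network(config_path)
deeplabcut.evaluate_network(config_path)

# Apply the trained network to new videos; the resulting keypoint
# trajectories can serve as the kinematic signal for coherence analysis.
deeplabcut.analyze_videos(config_path, ["/data/videos/tongue.mp4"])
```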

              The use of fast Fourier transform for the estimation of power spectra: A method based on time averaging over short, modified periodograms

              P. Welch (1967)
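The periodogram-averaging idea named in this title can be sketched in a few lines: split the signal into overlapping, windowed ("modified") segments, take the FFT of each, and average the resulting periodograms. The segment length, overlap, and test signal below are arbitrary illustrative choices; library routines such as scipy.signal.welch implement the same estimator with more care.

```python
# Sketch of Welch's estimator: average modified (windowed) periodograms
# computed over overlapping segments of the signal.
import numpy as np

def welch_psd(x, fs, nperseg=256, noverlap=128):
    window = np.hanning(nperseg)
    step = nperseg - noverlap
    scale = fs * np.sum(window ** 2)   # PSD normalization (one-sided doubling omitted)
    psds = []
    for start in range(0, x.size - nperseg + 1, step):
        seg = x[start:start + nperseg] * window          # modified periodogram
        psds.append(np.abs(np.fft.rfft(seg)) ** 2 / scale)
    freqs = np.fft.rfftfreq(nperseg, d=1 / fs)
    return freqs, np.mean(psds, axis=0)                  # time averaging

# Example: the spectrum of a noisy 2 Hz oscillation peaks near 2 Hz.
fs = 100.0
t = np.arange(0, 30, 1 / fs)
x = np.sin(2 * np.pi * 2.0 * t) + 0.5 * np.random.randn(t.size)
freqs, psd = welch_psd(x, fs)
print(f"peak at {freqs[np.argmax(psd)]:.2f} Hz")
```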

                Author and article information

                Contributors
                maezawa@ndr.med.osaka-u.ac.jp
Journal
Scientific Reports (Sci Rep), Nature Publishing Group UK, London
ISSN: 2045-2322
Published: 10 January 2022
Volume: 12
Article number: 388
Affiliations
[1] Department of Neurological Diagnosis and Restoration, Graduate School of Medicine, Osaka University, Yamadaoka 2-2, Suita, Osaka 565-0871, Japan (GRID grid.136593.b; ISNI 0000 0004 0373 3971)
[2] Graduate School of Simulation Studies, University of Hyogo, Minatojima-minamimachi 7-1-28, Chuo-ku, Kobe, Hyogo 650-0047, Japan (GRID grid.266453.0; ISNI 0000 0001 0724 9317)
[3] Graduate School of Medicine, Human Brain Research Center, Kyoto University, Kawahara-cho 53, Sakyo-ku, Kyoto 606-8507, Japan (GRID grid.258799.8; ISNI 0000 0004 0372 2033)
[4] Neurosurgery, Otemae Hospital, Otemae 1-5-34, Chuo-ku, Osaka 540-0008, Japan (GRID grid.417344.1; ISNI 0000 0004 0377 5581)
[5] Center for Information and Neural Networks (CiNet), National Institute of Information and Communications Technology and Osaka University, Yamadaoka 1-4, Suita, Osaka 565-0871, Japan (GRID grid.136593.b; ISNI 0000 0004 0373 3971)
Article
Publisher article ID: 4469
DOI: 10.1038/s41598-021-04469-0
PMCID: PMC8748830
PMID: 35013521
                © The Author(s) 2022

                Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.

History
Received: 30 August 2021
Accepted: 23 December 2021
Funding
Funded by: Japan Society for the Promotion of Science (FundRef: http://dx.doi.org/10.13039/501100001691)
Award IDs: 19K10218, 18H04166
                Categories
                Article
Keywords: neuroscience, physiology, neurology
