
      Neural substrates of the ability to recognize facial expressions: a voxel-based morphometry study


          Abstract

          The recognition of facial expressions of emotion is adaptive for human social interaction, but the ability to do this and the manner in which it is achieved differ among individuals. Previous functional neuroimaging studies have demonstrated that some brain regions, such as the inferior frontal gyrus (IFG), are active during the response to emotional facial expressions in healthy participants, and lesion studies have demonstrated that damage to these structures impairs the recognition of facial expressions. However, it remains to be established whether individual differences in the structure of these regions could be associated with differences in the ability to recognize facial expressions. We investigated this issue by acquiring structural magnetic resonance images and assessing the performance of healthy adults in recognizing the facial expressions of six basic emotions. The gray matter volume of the right IFG positively correlated with the total accuracy of facial expression recognition. This suggests that individual differences in the ability to recognize facial expressions are associated with differences in the structure of the right IFG. Furthermore, the mirror neuron activity of the IFG may be important for establishing efficient facial mimicry to facilitate emotion recognition.
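The structure-behavior correlation described in the abstract was estimated with voxel-based morphometry in a standard neuroimaging pipeline; the snippet below is only a minimal, self-contained sketch of the core step, a voxelwise correlation between gray matter volume and recognition accuracy, run on synthetic arrays. The sample size, array shapes and variable names are hypothetical, and this is not the authors' actual analysis.

    # Minimal sketch (not the authors' pipeline): correlate smoothed gray-matter
    # volume at every voxel with per-subject expression-recognition accuracy.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    n_subjects, n_voxels = 40, 10_000              # hypothetical sample and voxel counts
    gm = rng.normal(size=(n_subjects, n_voxels))   # stand-in for modulated, smoothed GM maps
    accuracy = rng.uniform(0.5, 1.0, n_subjects)   # stand-in for total recognition accuracy

    # Pearson correlation at every voxel, then a parametric two-tailed p-value.
    gm_z = (gm - gm.mean(axis=0)) / gm.std(axis=0)
    acc_z = (accuracy - accuracy.mean()) / accuracy.std()
    r = gm_z.T @ acc_z / n_subjects
    t = r * np.sqrt((n_subjects - 2) / (1.0 - r ** 2))
    p = 2 * stats.t.sf(np.abs(t), df=n_subjects - 2)

    # A real VBM analysis would correct these uncorrected p-values for multiple
    # comparisons (e.g. random field theory or permutation tests) before inference.
    best = p.argmin()
    print(f"strongest voxel: r = {r[best]:.3f}, uncorrected p = {p[best]:.2e}")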


          Most cited references (46)


          A unified statistical approach for determining significant signals in images of cerebral activation.

          We present a unified statistical theory for assessing the significance of apparent signal observed in noisy difference images. The results are usable in a wide range of applications, including fMRI, but are discussed with particular reference to PET images which represent changes in cerebral blood flow elicited by a specific cognitive or sensorimotor task. Our main result is an estimate of the P-value for local maxima of Gaussian, t, χ² and F fields over search regions of any shape or size in any number of dimensions. This unifies the P-values for large search areas in 2-D (Friston et al. [1991]: J Cereb Blood Flow Metab 11:690-699), large search regions in 3-D (Worsley et al. [1992]: J Cereb Blood Flow Metab 12:900-918) and the usual uncorrected P-value at a single pixel or voxel.
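As a rough illustration of the result summarized above, the sketch below evaluates only the leading three-dimensional term of the unified formula for a Gaussian field: the corrected P-value for a local maximum is approximated by the expected Euler characteristic of the excursion set, scaled by the size of the search region in resels. The function name and numbers are illustrative; the full result also includes lower-dimensional terms that matter for small or irregular search regions, and analogous expressions for t, χ² and F fields.

    # Sketch: leading 3-D term of the expected Euler characteristic for a Gaussian
    # field, used as an approximate corrected P-value for a local maximum.
    import numpy as np

    def rft_p_gaussian_3d(t: float, resels: float) -> float:
        """Approximate P(max Z > t) over a 3-D search region of `resels` resels."""
        ec_density = ((4 * np.log(2)) ** 1.5 / (2 * np.pi) ** 2
                      * (t ** 2 - 1) * np.exp(-t ** 2 / 2))
        return min(1.0, resels * ec_density)

    # e.g. a peak of Z = 4.5 searched over roughly 200 resels
    print(rft_p_gaussian_3d(4.5, 200.0))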

            Two systems for empathy: a double dissociation between emotional and cognitive empathy in inferior frontal gyrus versus ventromedial prefrontal lesions.

            Recent evidence suggests that there are two possible systems for empathy: a basic emotional contagion system and a more advanced cognitive perspective-taking system. However, it is not clear whether these two systems are part of a single interacting empathy system or whether they are independent. Additionally, the neuroanatomical bases of these systems are largely unknown. In this study, we tested the hypothesis that emotional empathic abilities (involving the mirror neuron system) are distinct from those related to cognitive empathy and that the two depend on separate anatomical substrates. Subjects with lesions in the ventromedial prefrontal (VM) or inferior frontal gyrus (IFG) cortices and two control groups were assessed with measures of empathy that incorporate both cognitive and affective dimensions. The findings reveal a remarkable behavioural and anatomic double dissociation between deficits in cognitive empathy (VM) and emotional empathy (IFG). Furthermore, precise anatomical mapping of lesions revealed Brodmann area 44 to be critical for emotional empathy while areas 11 and 10 were found necessary for cognitive empathy. These findings are consistent with these cortices being different in terms of synaptic hierarchy and phylogenetic age. The pattern of empathy deficits among patients with VM and IFG lesions represents the first direct evidence of a double dissociation between emotional and cognitive empathy using the lesion method.

              Functional atlas of emotional faces processing: a voxel-based meta-analysis of 105 functional magnetic resonance imaging studies.

              Most of our social interactions involve perception of emotional information from the faces of other people. Furthermore, such emotional processes are thought to be aberrant in a range of clinical disorders, including psychosis and depression. However, the exact neurofunctional maps underlying emotional facial processing are not well defined. Two independent researchers conducted separate comprehensive PubMed (1990 to May 2008) searches to find all functional magnetic resonance imaging (fMRI) studies using a variant of the emotional faces paradigm in healthy participants. The search terms were: "fMRI AND happy faces," "fMRI AND sad faces," "fMRI AND fearful faces," "fMRI AND angry faces," "fMRI AND disgusted faces" and "fMRI AND neutral faces." We extracted spatial coordinates and inserted them in an electronic database. We performed activation likelihood estimation analysis for voxel-based meta-analyses. Of the originally identified studies, 105 met our inclusion criteria. The overall database consisted of 1785 brain coordinates that yielded an overall sample of 1600 healthy participants. Quantitative voxel-based meta-analysis of brain activation provided neurofunctional maps for 1) main effect of human faces; 2) main effect of emotional valence; and 3) modulatory effect of age, sex, explicit versus implicit processing and magnetic field strength. Processing of emotional faces was associated with increased activation in a number of visual, limbic, temporoparietal and prefrontal areas; the putamen; and the cerebellum. Happy, fearful and sad faces specifically activated the amygdala, whereas angry or disgusted faces had no effect on this brain region. Furthermore, amygdala sensitivity was greater for fearful than for happy or sad faces. Insular activation was selectively reported during processing of disgusted and angry faces. However, insular sensitivity was greater for disgusted than for angry faces. Conversely, neural response in the visual cortex and cerebellum was observable across all emotional conditions. Although the activation likelihood estimation approach is currently one of the most powerful and reliable meta-analytical methods in neuroimaging research, it is insensitive to effect sizes. Our study has detailed neurofunctional maps to use as normative references in future fMRI studies of emotional facial processing in psychiatric populations. We found selective differences between neural networks underlying the basic emotions in limbic and insular brain regions.
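The activation likelihood estimation (ALE) step described above can be pictured with a toy sketch: each reported focus is modeled as a 3-D Gaussian probability, foci from one study are merged into a modeled-activation (MA) map, and MA maps are combined across studies as a probabilistic union. The grid size, kernel width and coordinates below are made up, and the sample-size-dependent kernels, normalized probabilities and permutation-based thresholding of the real method are omitted.

    # Toy sketch of the core ALE combination step on a small synthetic grid.
    import numpy as np

    def ma_map(foci_vox, shape, sigma_vox):
        """Modeled-activation map for one study: voxelwise max over its foci Gaussians."""
        zz, yy, xx = np.indices(shape)
        ma = np.zeros(shape)
        for z, y, x in foci_vox:
            d2 = (zz - z) ** 2 + (yy - y) ** 2 + (xx - x) ** 2
            ma = np.maximum(ma, np.exp(-d2 / (2 * sigma_vox ** 2)))
        return ma

    shape, sigma = (20, 20, 20), 2.0
    studies = [[(10, 10, 10), (5, 5, 5)], [(11, 9, 10)], [(15, 15, 4)]]  # foci per study
    ma_maps = [ma_map(foci, shape, sigma) for foci in studies]

    # ALE value: probability that at least one study activates each voxel.
    ale = 1.0 - np.prod([1.0 - m for m in ma_maps], axis=0)
    print(ale.max(), np.unravel_index(ale.argmax(), shape))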

                Author and article information

                Journal: Social Cognitive and Affective Neuroscience (Soc Cogn Affect Neurosci)
                Publisher: Oxford University Press
                ISSN: 1749-5016 (print); 1749-5024 (electronic)
                Issue date: March 2017; published online 26 September 2016
                Volume: 12, Issue: 3, Pages: 487-495
                Affiliations
                [1] Graduate School of Medicine, Kyoto University, 53 Shogoin-Kawahara-cho, Sakyo-ku, Kyoto 606-8507, Japan
                [2] ATR Brain Activity Imaging Center, 2-2-2, Hikaridai, Seika-cho, Souraku-gun, Kyoto 619-0288, Japan
                [3] The Organization for Promoting Neurodevelopmental Disorder Research, 40 Shogoin-Sannocho, Sakyo-ku, Kyoto 606-8392, Japan
                [4] Health and Medical Services Center, Shiga University, 1-1-1, Baba, Hikone, Shiga 522-8522, Japan
                Author notes
                Correspondence should be addressed to Shota Uono, Department of Neurodevelopmental Psychiatry, Habilitation, and Rehabilitation, Graduate School of Medicine, Kyoto University, 53 Shogoin-Kawahara-cho, Sakyo-ku, Kyoto 606-8507, Japan. E-mail: uonoshota1982@gmail.com.
                Article
                Article ID: nsw142
                DOI: 10.1093/scan/nsw142
                PMC: PMC5390731
                PMID: 27672176
                © The Author(s) (2016). Published by Oxford University Press.

                This is an Open Access article distributed under the terms of the Creative Commons Attribution Non-Commercial License (http://creativecommons.org/licenses/by-nc/4.0/), which permits non-commercial re-use, distribution, and reproduction in any medium, provided the original work is properly cited. For commercial re-use, please contact journals.permissions@oup.com.

                History
                Received: 14 April 2016
                Revised: 28 August 2016
                Accepted: 21 September 2016
                Page count
                Pages: 9
                Funding
                Funded by: JSPS Funding Program for Next Generation World-Leading Researchers
                Award ID: LZ008
                Categories
                Original Articles

                Neurosciences
                Keywords: cerebellum, facial expression recognition, inferior frontal gyrus, superior temporal gyrus, voxel-based morphometry
