
      Two face masks are better than one: congruency effects in face matching

      research-article


          Abstract

          Although the positive effects of congruency between stimuli are well replicated in face memory paradigms, findings in face matching have been mixed. Owing to the COVID-19 pandemic, face masks have become commonplace in everyday outdoor activities. The present study therefore further explores congruency effects when matching faces partially occluded by surgical masks. Observers performed a face matching task consisting of pairs of faces presented in full view (full-view condition), pairs in which only one of the faces wore a mask (one-mask condition), and pairs in which both faces wore a mask (two-mask condition). Although face masks disrupted performance in both identity match and identity mismatch trials, in match trials we found better performance in the two-mask condition than in the one-mask condition. This finding highlights the importance of congruency between stimuli in face matching when telling faces together.


          Most cited references (60)


          Calculation of signal detection theory measures


            Corrections for extreme proportions and their biasing effects on estimated values of d′
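The two reference titles above concern computing the sensitivity measure d′ and correcting hit/false-alarm proportions of exactly 0 or 1, which would otherwise yield infinite d′. As a minimal illustration, the sketch below applies the log-linear (add-0.5) correction, one standard approach to this problem; the function and variable names here are illustrative, not drawn from the article itself.

```python
# Minimal sketch of d' (sensitivity) from raw trial counts, using the
# log-linear correction so extreme proportions (0 or 1) stay finite.
from statistics import NormalDist

def d_prime(hits: int, misses: int,
            false_alarms: int, correct_rejections: int) -> float:
    """d' with the log-linear (add 0.5 per cell) correction."""
    # Adding 0.5 to each cell (and 1 to each total) keeps both rates
    # strictly between 0 and 1, even for perfect or empty cells.
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    z = NormalDist().inv_cdf  # inverse of the standard normal CDF
    return z(hit_rate) - z(fa_rate)

# A perfect hit rate (20/20) would make uncorrected d' infinite;
# the correction returns a large but finite value instead.
d = d_prime(hits=20, misses=0, false_alarms=2, correct_rejections=18)
```

Without the correction, z(1.0) is undefined, so any observer with a perfect or zero rate in some condition could not be scored; this is the biasing issue the second reference examines.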


              The Cambridge Face Memory Test: results for neurologically intact individuals and an investigation of its validity using inverted face stimuli and prosopagnosic participants.

              The two standardized tests of face recognition that are widely used suffer from serious shortcomings [Duchaine, B. & Weidenfeld, A. (2003). An evaluation of two commonly used tests of unfamiliar face recognition. Neuropsychologia, 41, 713-720; Duchaine, B. & Nakayama, K. (2004). Developmental prosopagnosia and the Benton Facial Recognition Test. Neurology, 62, 1219-1220]. Images in the Warrington Recognition Memory for Faces test include substantial non-facial information, and the simultaneous presentation of faces in the Benton Facial Recognition Test allows feature matching. Here, we present results from a new test, the Cambridge Face Memory Test, which builds on the strengths of the previous tests. In the test, participants are introduced to six target faces, and then they are tested with forced choice items consisting of three faces, one of which is a target. For each target face, three test items contain views identical to those studied in the introduction, five present novel views, and four present novel views with noise. There are a total of 72 items, and 50 controls averaged 58. To determine whether the test requires the special mechanisms used to recognize upright faces, we conducted two experiments. We predicted that controls would perform much more poorly when the face images are inverted, and as predicted, inverted performance was much worse with a mean of 42. Next we assessed whether eight prosopagnosics would perform poorly on the upright version. The prosopagnosic mean was 37, and six prosopagnosics scored outside the normal range. In contrast, the Warrington test and the Benton test failed to classify a majority of the prosopagnosics as impaired. These results indicate that the new test effectively assesses face recognition across a wide range of abilities.

                Author and article information

                Contributors
                aestudillo@bournemouth.ac.uk
                Journal
                Cogn Res Princ Implic
                Cogn Res Princ Implic
                Cognitive Research: Principles and Implications
                Springer International Publishing (Cham)
                2365-7464
                8 June 2022
                December 2022
                Volume 7, article 49
                Affiliations
                [1] Department of Psychology, Bournemouth University, Poole BH12 5BB, UK (GRID grid.17236.31, ISNI 0000 0001 0728 4630)
                [2] University of Nottingham Malaysia, Semenyih, Malaysia (GRID grid.440435.2, ISNI 0000 0004 1802 0472)
                Author information
                http://orcid.org/0000-0002-7760-318X
                Article
                402
                DOI: 10.1186/s41235-022-00402-9
                PMCID: 9175166
                PMID: 35674914
                36a6dc24-e748-4987-b4ea-5691e3995267
                © The Author(s) 2022

                Open Access. This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.

                History
                Received: 24 July 2021
                Accepted: 30 May 2022
                Categories
                Original Article
