
      Deception detection and question effects: testing truth-default theory predictions in South Korea

      Human Communication Research
      Oxford University Press (OUP)


          Abstract

          Meta-analysis has shown that people are only slightly better than chance at distinguishing truths from lies in deception detection experiments. Truth-default theory (TDT), however, specifies multiple paths to lowering and raising accuracy. The current experiment (n = 81) tested truth-default theory's proposition 13 and its diagnostic questioning module with a student sample from South Korea. The proposition and module predict that how an interviewee is questioned can affect deception detection in both directions, improving or reducing accuracy. Consistent with the original findings, questioning significantly enhanced (65%) and reduced (30%) deception-detection accuracy relative to the meta-analytic baseline (54%). The current findings provide additional evidence consistent with TDT and replicate prior findings documenting substantial question effects on deception-detection accuracy. The implications of question effects for non-native speakers and intercultural lie detection are discussed.

          Related collections

          Most cited references (7)


          Accuracy of deception judgments.

          We analyze the accuracy of deception judgments, synthesizing research results from 206 documents and 24,483 judges. In relevant studies, people attempt to discriminate lies from truths in real time with no special aids or training. In these circumstances, people achieve an average of 54% correct lie-truth judgments, correctly classifying 47% of lies as deceptive and 61% of truths as nondeceptive. Relative to cross-judge differences in accuracy, mean lie-truth discrimination abilities are nontrivial, with a mean accuracy d of roughly .40. This produces an effect that is at roughly the 60th percentile in size, relative to others that have been meta-analyzed by social psychologists. Alternative indexes of lie-truth discrimination accuracy correlate highly with percentage correct, and rates of lie detection vary little from study to study. Our meta-analyses reveal that people are more accurate in judging audible than visible lies, that people appear deceptive when motivated to be believed, and that individuals regard their interaction partners as honest. We propose that people judge others' deceptions more harshly than their own and that this double standard in evaluating deceit can explain much of the accumulated literature.

            Content in Context Improves Deception Detection Accuracy


              A World of Lies.

              This article reports two worldwide studies of stereotypes about liars. These studies are carried out in 75 different countries and 43 different languages. In Study 1, participants respond to the open-ended question "How can you tell when people are lying?" In Study 2, participants complete a questionnaire about lying. These two studies reveal a dominant pan-cultural stereotype: that liars avert gaze. The authors identify other common beliefs and offer a social control interpretation.

                Author and article information

                Journal
                Human Communication Research
                Oxford University Press (OUP)
                ISSN: 0360-3989; 1468-2958
                Published: July 12 2023; September 25 2023; October 01 2023
                Volume 49, Issue 4, pp. 448-451
                DOI: 10.1093/hcr/hqad026
                © 2023

                https://academic.oup.com/pages/standard-publication-reuse-rights
