
      Conceptual, methodological, and measurement factors that disqualify use of measurement invariance techniques to detect informant discrepancies in youth mental health assessments

      review-article


          Abstract

          On page 1 of his classic text, Millsap (2011) states, “Measurement invariance is built on the notion that a measuring device should function the same way across varied conditions, so long as those varied conditions are irrelevant [emphasis added] to the attribute being measured.” By construction, measurement invariance techniques require not only detecting varied conditions but also ruling out that these conditions inform our understanding of measured domains (i.e., conditions that do not contain domain-relevant information). In fact, measurement invariance techniques possess great utility when theory and research inform their application to specific, varied conditions (e.g., cultural, ethnic, or racial background of test respondents) that, if not detected, introduce measurement biases, and, thus, depress measurement validity (e.g., academic achievement and intelligence). Yet, we see emerging bodies of work where scholars have “put the cart before the horse” when it comes to measurement invariance, and they apply these techniques to varied conditions that, in fact, may reflect domain-relevant information. These bodies of work highlight a larger problem in measurement that likely cuts across many areas of scholarship. In one such area, youth mental health, researchers commonly encounter a set of conditions that nullify the use of measurement invariance, namely discrepancies between survey reports completed by multiple informants, such as parents, teachers, and youth themselves (i.e., informant discrepancies). In this paper, we provide an overview of conceptual, methodological, and measurement factors that should prevent researchers from applying measurement invariance techniques to detect informant discrepancies. Along the way, we cite evidence from the last 15 years indicating that informant discrepancies reflect domain-relevant information. We also apply this evidence to recent uses of measurement invariance techniques in youth mental health. 
Based on prior evidence, we highlight the implications of applying these techniques to multi-informant data, when the informant discrepancies observed within these data might reflect domain-relevant information. We close by calling for a moratorium on applying measurement invariance techniques to detect informant discrepancies in youth mental health assessments. In doing so, we describe how the state of the science would need to fundamentally “flip” to justify applying these techniques to detect informant discrepancies in this area of work.


                Author and article information

Journal
Frontiers in Psychology (Front. Psychol.)
Frontiers Media S.A.
ISSN: 1664-1078
Published: 02 August 2022
Volume: 13
Article: 931296
Affiliations
1. Comprehensive Assessment and Intervention Program, Department of Psychology, The University of Maryland at College Park, College Park, MD, United States
2. Resilient Adaptation Across Culture and Context Lab, Department of Psychology, The University of Maryland at College Park, College Park, MD, United States
3. Department of Psychological Sciences, University of Missouri, Columbia, MO, United States
4. Anxiety and Illness Behaviour Laboratory, Department of Psychology, University of Regina, Regina, SK, Canada
                Author notes

                Edited by: Scott Thomas Meier, University at Buffalo, United States

                Reviewed by: Wendy Guyker, University at Buffalo, United States; James Mcdougal, SUNY Oswego, United States

*Correspondence: Andres De Los Reyes, adlr@umd.edu

                This article was submitted to Quantitative Psychology and Measurement, a section of the journal Frontiers in Psychology

Article
DOI: 10.3389/fpsyg.2022.931296
PMCID: PMC9378825
PMID: 35983202
                Copyright © 2022 De Los Reyes, Tyrell, Watts and Asmundson.

                This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

History
Received: 28 April 2022
Accepted: 24 June 2022
                Page count
                Figures: 4, Tables: 2, Equations: 0, References: 121, Pages: 18, Words: 14355
                Funding
                Funded by: Institute of Education Sciences, doi 10.13039/100005246;
                Funded by: National Institutes of Health, doi 10.13039/100000002;
                Funded by: Fulbright Canada, doi 10.13039/100010081;
                Categories
                Psychology
                Conceptual Analysis

                Clinical Psychology & Psychiatry
Keywords: converging operations, diverging operations, domain-relevant information, informant discrepancies, operations triad model
