
      ICF Linking and Cognitive Interviewing Are Complementary Methods for Optimizing Content Validity of Outcome Measures: An Integrated Methods Review

Review article


          Abstract

Content validity is a fundamental requirement of outcome measures. After reviewing operational needs and existing definitions, content validity was defined as: the extent to which a measure provides a comprehensive and true assessment of the key relevant elements of a specified construct or attribute across a defined range, clearly and equitably for a stated target audience and context. ICF linking rules from 2002, 2005, and 2019 have provided increasingly clear processes for describing and evaluating the content of outcome measures. ICF Core Sets provide international reference standards of the core constructs of importance for different health conditions. Both are important as reference standards during content validation. To summarize their use as reference standards, the following summary indicators were proposed: (1) Measure to ICF Linkage, (2) Measure to (Brief or Comprehensive) Core Set Absolute Linkage, (3) Measure to (Brief or Comprehensive) Core Set Unique Linkage, (4) Core Set Representation, and (5) Core Set Unique Disability Representation. Methods to assess how respondents engage with content are needed to complement ICF linking. Cognitive interviewing is an ideal method since it is used to explore how respondents interpret and calibrate responses to individual items on an outcome measure. We proposed a framework for classifying these responses: Clarity/Comprehension, Relevance, Inadequate Response Definition, Reference Point, Perspective Modification, and Calibration Across Items. Our analysis of 24 manuscripts that used ICF linking for content validation since the updated linking rules were published found that authors typically used linking to validate existing measures, involved multiple raters, used the 2005 linking rules, summarized content at a concept level (e.g., impairment, activity, participation), and/or used Core Sets as a reference standard. Infrequently, authors used ICF linking to create item pools or conceptual frameworks for new measures, applied the full scope of the 2019 linking rules, used summary indicators, or integrated ICF linking with qualitative methods such as cognitive interviews. We conclude that ICF linking is a powerful tool for establishing content validity during the development or validation of patient-reported outcome measures (PROMs). Best practices include use of the updated ICF linking rules, triangulation of ICF linking with participant assessments of clarity and relevance, preferably obtained using cognitive interview methods, and application of defined summary indicators.
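The abstract does not spell out how these summary indicators are calculated. As a purely illustrative sketch, the snippet below treats them as proportion-based overlaps between the ICF categories linked from a measure and the categories of a reference Core Set; the function name, the data layout, and the exact definitions (including the omission of indicator 5, which would require flagging disability-specific categories) are assumptions, not the authors' specification.

```python
# Illustrative sketch only: these proportion-based definitions are assumptions,
# not the formulas published by the authors.

def linkage_indicators(measure_links, core_set):
    """Summarize overlap between a measure's linked ICF categories and a Core Set.

    measure_links: dict mapping each item of the measure to the set of ICF
                   category codes it was linked to (empty set if unlinkable).
    core_set:      set of ICF category codes in the (Brief or Comprehensive) Core Set.
    """
    items = list(measure_links)                       # assumes at least one item
    all_codes = set().union(*measure_links.values())  # all distinct linked codes
    codes_in_core_set = all_codes & core_set

    return {
        # 1. Proportion of items linkable to any ICF category.
        "measure_to_icf_linkage": sum(bool(c) for c in measure_links.values()) / len(items),
        # 2. Proportion of items linked to at least one Core Set category.
        "measure_to_core_set_absolute_linkage":
            sum(bool(measure_links[i] & core_set) for i in items) / len(items),
        # 3. Proportion of distinct linked categories that fall inside the Core Set.
        "measure_to_core_set_unique_linkage":
            len(codes_in_core_set) / len(all_codes) if all_codes else 0.0,
        # 4. Proportion of Core Set categories that the measure covers.
        "core_set_representation": len(codes_in_core_set) / len(core_set),
    }

# Hypothetical example: a three-item measure checked against a small Brief Core Set.
measure = {"item1": {"b280", "d430"}, "item2": {"d445"}, "item3": set()}
brief_core_set = {"b280", "d430", "d440", "d445", "e355"}
print(linkage_indicators(measure, brief_core_set))
```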


Most cited references (79)


          Consolidated criteria for reporting qualitative research (COREQ): a 32-item checklist for interviews and focus groups.

          Qualitative research explores complex phenomena encountered by clinicians, health care providers, policy makers and consumers. Although partial checklists are available, no consolidated reporting framework exists for any type of qualitative design. To develop a checklist for explicit and comprehensive reporting of qualitative studies (in depth interviews and focus groups). We performed a comprehensive search in Cochrane and Campbell Protocols, Medline, CINAHL, systematic reviews of qualitative studies, author or reviewer guidelines of major medical journals and reference lists of relevant publications for existing checklists used to assess qualitative studies. Seventy-six items from 22 checklists were compiled into a comprehensive list. All items were grouped into three domains: (i) research team and reflexivity, (ii) study design and (iii) data analysis and reporting. Duplicate items and those that were ambiguous, too broadly defined and impractical to assess were removed. Items most frequently included in the checklists related to sampling method, setting for data collection, method of data collection, respondent validation of findings, method of recording data, description of the derivation of themes and inclusion of supporting quotations. We grouped all items into three domains: (i) research team and reflexivity, (ii) study design and (iii) data analysis and reporting. The criteria included in COREQ, a 32-item checklist, can help researchers to report important aspects of the research team, study methods, context of the study, findings, analysis and interpretations.

            Whatever happened to qualitative description?

The general view of descriptive research as a lower level form of inquiry has influenced some researchers conducting qualitative research to claim methods they are really not using and not to claim the method they are using: namely, qualitative description. Qualitative descriptive studies have as their goal a comprehensive summary of events in the everyday terms of those events. Researchers conducting qualitative descriptive studies stay close to their data and to the surface of words and events. Qualitative descriptive designs typically are an eclectic but reasonable combination of sampling, and data collection, analysis, and re-presentation techniques. Qualitative descriptive study is the method of choice when straight descriptions of phenomena are desired. Copyright 2000 John Wiley & Sons, Inc.

              The content validity index: are you sure you know what's being reported? Critique and recommendations.

              Scale developers often provide evidence of content validity by computing a content validity index (CVI), using ratings of item relevance by content experts. We analyzed how nurse researchers have defined and calculated the CVI, and found considerable consistency for item-level CVIs (I-CVIs). However, there are two alternative, but unacknowledged, methods of computing the scale-level index (S-CVI). One method requires universal agreement among experts, but a less conservative method averages the item-level CVIs. Using backward inference with a purposive sample of scale development studies, we found that both methods are being used by nurse researchers, although it was not always possible to infer the calculation method. The two approaches can lead to different values, making it risky to draw conclusions about content validity. Scale developers should indicate which method was used to provide readers with interpretable content validity information. (c) 2006 Wiley Periodicals, Inc.
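Since this abstract contrasts two ways of rolling item-level CVIs up to a scale-level index, a small worked sketch may help. It assumes the conventional definition of the I-CVI as the proportion of experts rating an item 3 or 4 on a 4-point relevance scale; that convention, and the variable names below, come from common CVI practice rather than from this record.

```python
# Illustrative sketch: I-CVI assumed to be the proportion of experts rating an
# item 3 or 4 on a 4-point relevance scale (a common convention, not from this record).

def content_validity_indices(ratings):
    """ratings: list of lists; ratings[i][j] is expert j's 1-4 relevance rating
    of item i. Returns item-level CVIs plus both scale-level variants."""
    i_cvis = [sum(r >= 3 for r in item) / len(item) for item in ratings]
    s_cvi_ua = sum(cvi == 1.0 for cvi in i_cvis) / len(i_cvis)   # universal agreement
    s_cvi_ave = sum(i_cvis) / len(i_cvis)                        # average of I-CVIs
    return i_cvis, s_cvi_ua, s_cvi_ave

# Example: 4 items rated by 5 experts; the two scale-level methods diverge.
ratings = [
    [4, 4, 3, 4, 4],   # I-CVI = 1.00
    [4, 3, 3, 4, 2],   # I-CVI = 0.80
    [4, 4, 4, 4, 4],   # I-CVI = 1.00
    [3, 3, 4, 2, 4],   # I-CVI = 0.80
]
i_cvis, ua, ave = content_validity_indices(ratings)
print(i_cvis)   # [1.0, 0.8, 1.0, 0.8]
print(ua, ave)  # 0.5 0.9
```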

                Author and article information

Journal
Frontiers in Rehabilitation Sciences (Front. Rehabil. Sci.)
Frontiers Media S.A.
ISSN: 2673-6861
Published: 14 October 2021
Volume: 2, Article: 702596
                Affiliations
1. Department of Surgery, School of Physical Therapy, Western University, London, ON, Canada
2. Hand and Upper Limb Centre, St. Joseph's Health Centre, London, ON, Canada
                Author notes

                Edited by: Soraya Maart, University of Cape Town, South Africa

                Reviewed by: Gabriel Ronen, McMaster University, Canada; Ntsikelelo Pefile, Stellenbosch University, South Africa

*Correspondence: Joy C. MacDermid, jmacderm@uwo.ca

                This article was submitted to Human Functioning, a section of the journal Frontiers in Rehabilitation Sciences

Article
DOI: 10.3389/fresc.2021.702596
PMCID: 9397968
PMID: 36188847
                Copyright © 2021 MacDermid.

                This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

History
Received: 29 April 2021
Accepted: 10 September 2021
                Page count
                Figures: 1, Tables: 4, Equations: 0, References: 81, Pages: 15, Words: 13267
                Funding
Funded by: Canadian Institutes of Health Research (DOI: 10.13039/501100000024)
                Categories
                Rehabilitation Sciences
                Review

Keywords: ICF, linking, content validity, PROM, cognitive interviewing, methods, outcome measures
