
      Development and Validation of Computerized Adaptive Assessment Tools for the Measurement of Posttraumatic Stress Disorder Among US Military Veterans


          Key Points

          Question

          Can rapid, psychometrically sound adaptive diagnostic screening and dimensional severity measures be developed for posttraumatic stress disorder?

          Findings

          In this diagnostic study including 713 US military veterans, the Computerized Adaptive Diagnostic–Posttraumatic Stress Disorder measure was shown to have excellent diagnostic accuracy. The Computerized Adaptive Test–Posttraumatic Stress Disorder also provided valid severity ratings and demonstrated convergent validity with the PTSD Checklist for Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition (PCL-5).

          Meaning

          In this study, the Computerized Adaptive Diagnostic–Posttraumatic Stress Disorder and Computerized Adaptive Test–Posttraumatic Stress Disorder measures appeared to provide valid screening diagnoses and severity scores, with substantial reductions in patient and clinician burden.

          Abstract

          Importance

          Veterans from recent and past conflicts have high rates of posttraumatic stress disorder (PTSD). Adaptive testing strategies can increase the accuracy of diagnostic screening and symptom severity measurement while decreasing patient and clinician burden.
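
          The abstract does not describe the item-selection machinery itself, so the following is only a minimal sketch of the general idea behind computerized adaptive testing: under an item response theory model (here a simple unidimensional two-parameter logistic model, which is an assumption, not the authors' model), each successive item is chosen to be maximally informative at the respondent's current severity estimate. All parameter values and function names below are hypothetical; only the 203-item bank size is taken from the Results.

```python
import numpy as np

def item_information(theta, a, b):
    """Fisher information of a two-parameter logistic (2PL) item at severity theta."""
    p = 1.0 / (1.0 + np.exp(-a * (theta - b)))   # probability of symptom endorsement
    return a**2 * p * (1.0 - p)

def next_item(theta_hat, a, b, administered):
    """Choose the unadministered item that is most informative at the current estimate."""
    info = item_information(theta_hat, a, b)
    info[list(administered)] = -np.inf           # never repeat an item
    return int(np.argmax(info))

# Hypothetical item bank: discrimination (a) and severity/location (b) parameters.
rng = np.random.default_rng(0)
a = rng.uniform(0.8, 2.5, size=203)
b = rng.normal(0.0, 1.0, size=203)

theta_hat, administered = 0.0, set()
for _ in range(10):                              # e.g., a 10-item adaptive session
    item = next_item(theta_hat, a, b, administered)
    administered.add(item)
    # ...score the response and re-estimate theta_hat (e.g., EAP/MAP) here...
print(sorted(administered))
```

          Because each item is selected where it is most informative, a short adaptive session can approximate the full-bank score, which is the mechanism behind the burden reduction reported in the Results.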

          Objective

          To develop and validate a computerized adaptive diagnostic (CAD) screener and computerized adaptive test (CAT) for PTSD symptom severity.

          Design, Setting, and Participants

          This diagnostic study of measure development and validation was conducted at a Veterans Health Administration facility from April 25, 2017, to November 10, 2019, and included a total of 713 US military veterans.

          Main Outcomes and Measures

          The participants completed a PTSD-symptom questionnaire from the item bank and provided responses on the PTSD Checklist for Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition (DSM-5) (PCL-5). A subsample of 304 participants was interviewed using the Clinician-Administered PTSD Scale for DSM-5.

          Results

          Of the 713 participants, 585 were men; mean (SD) age was 52.8 (15.0) years. The CAD-PTSD reproduced the Clinician-Administered PTSD Scale for DSM-5 diagnosis with high sensitivity and specificity, as evidenced by an area under the curve of 0.91 (95% CI, 0.87-0.95). The CAT-PTSD demonstrated convergent validity with the PCL-5 (r = 0.88) and also tracked PTSD diagnosis (area under the curve = 0.85; 95% CI, 0.79-0.89). The CAT-PTSD reproduced the final 203-item bank score with a correlation of r = 0.95 using a mean of only 10 adaptively administered items, a 95% reduction in patient burden.
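
          As a rough illustration of how the reported quantities fit together, the sketch below uses simulated stand-ins: diagnostic accuracy is summarized by the area under the ROC curve relating continuous screener scores to interview-based diagnoses (304 interviewed participants), and the burden reduction follows from administering a mean of 10 items out of the 203-item bank. The data arrays are fabricated for illustration; only the sample sizes and item counts come from the abstract.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

# Hypothetical stand-ins: 1 = interview-based PTSD diagnosis, 0 = none,
# paired with continuous adaptive screener scores (values are simulated).
rng = np.random.default_rng(1)
diagnosis = rng.integers(0, 2, size=304)               # 304 interviewed participants
screener_score = diagnosis + rng.normal(0.0, 0.6, size=304)

auc = roc_auc_score(diagnosis, screener_score)         # area under the ROC curve
print(f"AUC = {auc:.2f}")

# Burden reduction quoted in the abstract: a mean of 10 adaptively administered
# items versus the full 203-item bank.
print(f"Item reduction: {1 - 10 / 203:.0%}")           # ~95%
```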

          Conclusions and Relevance

          Using a maximum of only 6 items, the CAD-PTSD developed in this study was shown to have excellent diagnostic screening accuracy. Similarly, using a mean of 10 items, the CAT-PTSD provided valid severity ratings and showed excellent convergent validity with an existing scale containing twice as many items. The 10-item CAT-PTSD also outperformed the 20-item PCL-5 in terms of diagnostic accuracy. These results suggest that scalable, valid, and rapid PTSD diagnostic screening and severity measurement are possible.

          Abstract

          This diagnostic study develops and validates a tool for use in diagnostic screening of US military veterans for posttraumatic stress disorder.

                Author and article information

                Journal
                JAMA Network Open (JAMA Netw Open)
                Publisher: American Medical Association
                ISSN: 2574-3805
                Published: 8 July 2021
                Volume 4, Issue 7: e2115707
                Affiliations
                [1] VA Rocky Mountain Mental Illness Research, Education and Clinical Center, Rocky Mountain Regional Veterans Affairs Medical Center, Eastern Colorado Health Care System, Aurora
                [2] Department of Physical Medicine & Rehabilitation, University of Colorado, Anschutz Medical Campus, Aurora
                [3] Department of Psychiatry & Neurology, University of Colorado, Anschutz Medical Campus, Aurora
                [4] Pittsburgh School of Medicine, Pittsburgh, Pennsylvania
                [5] Department of Medicine, University of Chicago, Chicago, Illinois
                [6] Department of Computer Science, University of Chicago, Chicago, Illinois
                [7] Committee on Quantitative Methods, University of Chicago, Chicago, Illinois
                [8] Committee on Genetics, Genomics & Systems Biology, University of Chicago, Chicago, Illinois
                [9] Center for Health Statistics, University of Chicago, Chicago, Illinois
                Author notes
                Article Information
                Accepted for Publication: April 20, 2021.
                Published: July 8, 2021. doi:10.1001/jamanetworkopen.2021.15707
                Open Access: This is an open access article distributed under the terms of the CC-BY License. © 2021 Brenner LA et al. JAMA Network Open.
                Corresponding Author: Lisa A. Brenner, PhD, VA Rocky Mountain Mental Illness Research, Education and Clinical Center, Rocky Mountain Regional Veterans Affairs Medical Center, Eastern Colorado Health Care System, 1700 N Wheeling St, Aurora, CO 80045 (lisa.2.brenner@cuanschutz.edu).
                Author Contributions: Dr Brenner had full access to all of the data in the study and takes responsibility for the integrity of the data and the accuracy of the data analysis.
                Concept and design: Brenner, Germain, Frank, Kupfer, Gibbons.
                Acquisition, analysis, or interpretation of data: Brenner, Betthauser, Penzenik, Li, Chattopadhyay, Gibbons.
                Drafting of the manuscript: Brenner, Betthauser, Penzenik, Gibbons.
                Critical revision of the manuscript for important intellectual content: Penzenik, Germain, Li, Chattopadhyay, Frank, Kupfer, Gibbons.
                Statistical analysis: Penzenik, Li, Chattopadhyay, Gibbons.
                Obtained funding: Brenner.
                Administrative, technical, or material support: Brenner, Betthauser, Penzenik.
                Conflict of Interest Disclosures: Dr Brenner reported receiving fees as a consultant for sports leagues and has received royalties from American Psychological Association Publishing. Dr Gibbons has been an expert witness for the US Department of Justice, Merck, GlaxoSmithKline, Pfizer, and Wyeth and is a founder of Adaptive Testing Technologies, which distributes the Computerized Adaptive Test–Mental Health (CAT-MH) battery of adaptive tests, in which the Computerized Adaptive Test–Posttraumatic Stress Disorder (CAT-PTSD) is included. The terms of this arrangement have been reviewed and approved by the University of Chicago in accordance with its conflict of interest policies. Dr Frank is a founder of Adaptive Testing Technologies, which distributes the CAT-MH battery of adaptive tests. She has an equity interest in Adaptive Testing Technologies Inc and in HealthRhythms Inc and is a founder and an employee of HealthRhythms Inc. She has received royalties from Guilford Press and the American Psychological Association Press. Dr Kupfer is a founder of Adaptive Testing Technologies, which distributes the CAT-MH battery of adaptive tests. He has an equity interest in Adaptive Testing Technologies Inc and in HealthRhythms Inc, of which he is a founder. He is a board member and holds an equity interest in Minerva Neuroscience and has received royalties for the Pittsburgh Sleep Quality Index from the University of Pittsburgh. Dr Gibbons is a founder of Adaptive Testing Technologies Inc, which distributes the CAT-MH battery of adaptive tests. The terms of this arrangement have been reviewed and approved by the University of Chicago in accordance with its conflict of interest policies. He has served as an expert witness regarding statistical issues for drug safety for Merck, Pfizer, GlaxoSmithKline, and the US Department of Justice.
                Funding/Support: Funding was provided by the Veterans Health Administration, Office of Mental Health and Suicide Prevention (National Institute of Mental Health grant R01 MH100155-06).
                Role of the Funder/Sponsor: The funding organizations had no role in the design and conduct of the study; collection, management, analysis, and interpretation of the data; preparation, review, or approval of the manuscript; and decision to submit the manuscript for publication. The VA Office of Mental Health and Suicide Prevention did not influence the decision to submit the manuscript for publication.
                Disclaimer: The views, opinions, and/or findings contained in this article are those of the authors and should not be construed as an official Department of Veterans Affairs position, policy, or decision unless so designated by other documentation.
                Additional Information: The POLYBIF program used is freely available at http://www.healthstats.org.
                Article
                Article ID: zoi210472
                DOI: 10.1001/jamanetworkopen.2021.15707
                PMCID: PMC8267606
                PMID: 34236411

                History
                Received: 23 December 2020
                Accepted: 20 April 2021
                Categories
                Research
                Original Investigation
                Online Only
                Psychiatry
