
      Electronic health record data quality assessment and tools: a systematic review

      review-article


          Abstract

          Objective

          We extended a 2013 literature review on electronic health record (EHR) data quality assessment approaches and tools to determine recent improvements or changes in EHR data quality assessment methodologies.

          Materials and Methods

          We completed a systematic review of PubMed articles from 2013 to April 2023 that discussed the quality assessment of EHR data. We screened and reviewed papers for the dimensions and methods defined in the original 2013 manuscript. We categorized papers as data quality outcomes of interest, tools, or opinion pieces. We abstracted and defined additional themes and methods through an iterative review process.

          Results

          We included 103 papers in the review, of which 73 were data quality outcomes of interest papers, 22 were tools, and 8 were opinion pieces. The most common dimension of data quality assessed was completeness, followed by correctness, concordance, plausibility, and currency. We abstracted conformance and bias as 2 additional dimensions of data quality and structural agreement as an additional methodology.
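          The dimensions named above are abstract, but two of them lend themselves to a simple illustration. The following sketch is not from the article; it shows, on a toy EHR-like record set with hypothetical field names and thresholds, how completeness (fraction of non-missing values) and plausibility (fraction of values in a clinically reasonable range) might be scored.

```python
# Illustrative sketch (not from the article): scoring two of the data quality
# dimensions discussed -- completeness and plausibility -- on a toy EHR-like
# record set. Field names and plausible ranges are hypothetical.

records = [
    {"patient_id": 1, "heart_rate": 72,   "sbp": 118},
    {"patient_id": 2, "heart_rate": None, "sbp": 140},  # missing heart rate
    {"patient_id": 3, "heart_rate": 350,  "sbp": None}, # implausible heart rate
]

def completeness(records, field):
    """Fraction of records with a non-missing value for `field`."""
    present = sum(1 for r in records if r.get(field) is not None)
    return present / len(records)

def plausibility(records, field, lo, hi):
    """Fraction of non-missing values of `field` inside a plausible range."""
    values = [r[field] for r in records if r.get(field) is not None]
    if not values:
        return None
    return sum(1 for v in values if lo <= v <= hi) / len(values)

print(completeness(records, "heart_rate"))           # 2 of 3 records present
print(plausibility(records, "heart_rate", 20, 250))  # 1 of 2 values in range
```

Real assessments operate over structured EHR extracts rather than dictionaries, but the same counting logic underlies the element-presence and validity-check methods the review describes.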

          Discussion

          There has been an increase in EHR data quality assessment publications since the original 2013 review. Consistent dimensions of EHR data quality continue to be assessed across applications. Despite these consistent patterns of assessment, no standard approach to assessing EHR data quality yet exists.

          Conclusion

          Guidelines are needed for EHR data quality assessment to improve the efficiency, transparency, comparability, and interoperability of data quality assessment. These guidelines must be both scalable and flexible. Automation could be helpful in generalizing this process.

          Related collections

          Most cited references (123)


          The PRISMA 2020 statement: an updated guideline for reporting systematic reviews

          The Preferred Reporting Items for Systematic reviews and Meta-Analyses (PRISMA) statement, published in 2009, was designed to help systematic reviewers transparently report why the review was done, what the authors did, and what they found. Over the past decade, advances in systematic review methodology and terminology have necessitated an update to the guideline. The PRISMA 2020 statement replaces the 2009 statement and includes new reporting guidance that reflects advances in methods to identify, select, appraise, and synthesise studies. The structure and presentation of the items have been modified to facilitate implementation. In this article, we present the PRISMA 2020 27-item checklist, an expanded checklist that details reporting recommendations for each item, the PRISMA 2020 abstract checklist, and the revised flow diagrams for original and updated reviews.

            Methods and dimensions of electronic health record data quality assessment: enabling reuse for clinical research

            Objective: To review the methods and dimensions of data quality assessment in the context of electronic health record (EHR) data reuse for research. Materials and methods: A review of the clinical research literature discussing data quality assessment methodology for EHR data was performed. Using an iterative process, the aspects of data quality being measured were abstracted and categorized, as well as the methods of assessment used. Results: Five dimensions of data quality were identified, which are completeness, correctness, concordance, plausibility, and currency, and seven broad categories of data quality assessment methods: comparison with gold standards, data element agreement, data source agreement, distribution comparison, validity checks, log review, and element presence. Discussion: Examination of the methods by which clinical researchers have investigated the quality and suitability of EHR data for research shows that there are fundamental features of data quality, which may be difficult to measure, as well as proxy dimensions. Researchers interested in the reuse of EHR data for clinical research are recommended to consider the adoption of a consistent taxonomy of EHR data quality, to remain aware of the task-dependence of data quality, to integrate work on data quality assessment from other fields, and to adopt systematic, empirically driven, statistically based methods of data quality assessment. Conclusion: There is currently little consistency or potential generalizability in the methods used to assess EHR data quality. If the reuse of EHR data for clinical research is to become accepted, researchers should adopt validated, systematic methods of EHR data quality assessment.

              Observational Health Data Sciences and Informatics (OHDSI): Opportunities for Observational Researchers.

              The vision of creating accessible, reliable clinical evidence by accessing the clinical experience of hundreds of millions of patients across the globe is a reality. Observational Health Data Sciences and Informatics (OHDSI) has built on learnings from the Observational Medical Outcomes Partnership to turn methods research and insights into a suite of applications and exploration tools that move the field closer to the ultimate goal of generating evidence about all aspects of healthcare to serve the needs of patients, clinicians and all other decision-makers around the world.

                Author and article information

                Contributors
                Journal
                J Am Med Inform Assoc
                J Am Med Inform Assoc
                jamia
                Journal of the American Medical Informatics Association : JAMIA
                Oxford University Press
                1067-5027
                1527-974X
                October 2023
                30 June 2023
                30 June 2023
                Volume: 30
                Issue: 10
                Pages: 1730-1740
                Affiliations
                Division of Computational and Data Sciences, Washington University in St. Louis , St. Louis, Missouri, USA
                Institute for Informatics, Data Science and Biostatistics, Washington University in St. Louis , St. Louis, Missouri, USA
                Department of Medical Informatics and Clinical Epidemiology, Oregon Health & Science University , Portland, Oregon, USA
                Institute for Informatics, Data Science and Biostatistics, Washington University in St. Louis , St. Louis, Missouri, USA
                Institute for Informatics, Data Science and Biostatistics, Washington University in St. Louis , St. Louis, Missouri, USA
                Institute for Informatics, Data Science and Biostatistics, Washington University in St. Louis , St. Louis, Missouri, USA
                Institute for Informatics, Data Science and Biostatistics, Washington University in St. Louis , St. Louis, Missouri, USA
                Institute for Informatics, Data Science and Biostatistics, Washington University in St. Louis , St. Louis, Missouri, USA
                Author notes
                Corresponding Author: Aditi Gupta, PhD, Institute for Informatics, Data Science and Biostatistics, Washington University in St. Louis, 660 S. Euclid Ave., Campus Box 8132, Saint Louis, MO 63110, USA; agupta24@wustl.edu
                Author information
                https://orcid.org/0000-0003-0365-909X
                https://orcid.org/0000-0001-5219-9996
                https://orcid.org/0000-0001-9255-9394
                https://orcid.org/0000-0002-9241-2656
                https://orcid.org/0000-0002-9532-2998
                Article
                ocad120
                10.1093/jamia/ocad120
                10531113
                37390812
                56edee40-64fc-4553-bd2a-07ee541dba5f
                © The Author(s) 2023. Published by Oxford University Press on behalf of the American Medical Informatics Association.

                This is an Open Access article distributed under the terms of the Creative Commons Attribution-NonCommercial License ( https://creativecommons.org/licenses/by-nc/4.0/), which permits non-commercial re-use, distribution, and reproduction in any medium, provided the original work is properly cited. For commercial re-use, please contact journals.permissions@oup.com

                History
                07 February 2023
                16 May 2023
                07 June 2023
                23 June 2023
                Page count
                Pages: 11
                Categories
                Review
                AcademicSubjects/MED00580
                AcademicSubjects/SCI01060
                AcademicSubjects/SCI01530

                Bioinformatics & Computational biology
                clinical research informatics,data quality,electronic health records
