
      Toward a Learning Health Care System: A Systematic Review and Evidence-Based Conceptual Framework for Implementation of Clinical Analytics in a Digital Hospital

      review-article


          Abstract

          Objective  A learning health care system (LHS) uses routinely collected data to continuously monitor and improve health care outcomes. Little is reported on the challenges and methods used to implement the analytics underpinning an LHS. Our aim was to systematically review the literature for reports of real-time clinical analytics implementation in digital hospitals and to use these findings to synthesize a conceptual framework for LHS implementation.

          Methods  Embase, PubMed, and Web of Science databases were searched for clinical analytics derived from electronic health records in adult inpatient and emergency department settings between 2015 and 2021. From the final study selection, evidence was coded relating to (1) dashboard implementation challenges, (2) methods to overcome implementation challenges, and (3) dashboard assessment and impact. This evidence, together with evidence extracted from relevant prior reviews, was mapped to an existing digital health transformation model to derive a conceptual framework for LHS analytics implementation.

          Results  A total of 238 candidate articles were reviewed and 14 met inclusion criteria. From the selected studies, we extracted 37 implementation challenges and 64 methods employed to overcome such challenges. We identified common approaches for evaluating the implementation of clinical dashboards. Six studies assessed clinical process outcomes and only four studies evaluated patient health outcomes. A conceptual framework for implementing the analytics of an LHS was developed.

          Conclusion  Health care organizations face diverse challenges when trying to implement real-time data analytics. These challenges have shifted over the past decade. While prior reviews identified fundamental information problems, such as data size and complexity, our review uncovered more postpilot challenges, such as supporting diverse users, workflows, and user-interface screens. Our review identified practical methods to overcome these challenges, which have been incorporated into a conceptual framework. It is hoped this framework will support health care organizations deploying near-real-time clinical dashboards as they progress toward an LHS.

          Related collections

          Most cited references (36)


          PARIHS revisited: from heuristic to integrated framework for the successful implementation of knowledge into practice

          Background  The Promoting Action on Research Implementation in Health Services (PARIHS) framework was first published in 1998. Since then, work has been ongoing to further develop, refine and test it. Widely used as an organising or conceptual framework to help both explain and predict why the implementation of evidence into practice is or is not successful, PARIHS was one of the first frameworks to make explicit the multi-dimensional and complex nature of implementation, as well as highlighting the central importance of context. Several critiques of the framework have also pointed out its limitations and suggested areas for improvement.

          Discussion  Building on the published critiques and a number of empirical studies, this paper introduces a revised version of the framework, called the integrated or i-PARIHS framework. The theoretical antecedents of the framework are described, as well as the revised and new elements: notably, the revision of how evidence is described, how individuals and teams are incorporated, and how context is further delineated. We describe how the framework can be operationalised and draw on case study data to demonstrate the preliminary testing of the face and content validity of the revised framework.

          Summary  This paper is presented for deliberation and discussion within the implementation science community. Responding to a series of critiques and helpful feedback on the utility of the original PARIHS framework, we seek feedback on the proposed improvements. We believe that the i-PARIHS framework creates a more integrated approach to understanding the theoretical complexity from which implementation science draws its propositions and working hypotheses; that the new framework is more coherent and comprehensive while maintaining its intuitive appeal; and that the models of facilitation described enable its more effective operationalisation.

          Electronic supplementary material  The online version of this article (doi:10.1186/s13012-016-0398-2) contains supplementary material, which is available to authorized users.

            Reviewing studies with diverse designs: the development and evaluation of a new tool.

            RATIONALE, AIMS & OBJECTIVE:  Tools for the assessment of the quality of research studies tend to be specific to a particular research design (e.g. randomized controlled trials, or qualitative interviews). This makes it difficult to assess the quality of a body of research that addresses the same or a similar research question but using different approaches. The aim of this paper is to describe the development and preliminary evaluation of a quality assessment tool that can be applied to a methodologically diverse set of research articles.

            The 16-item quality assessment tool (QATSDD) was assessed to determine its reliability and validity when used by health services researchers in the disciplines of psychology, sociology and nursing. Qualitative feedback was also gathered from mixed-methods health researchers regarding the comprehension, content, perceived value and usability of the tool.

            Reference to existing widely used quality assessment tools and experts in systematic review confirmed that the components of the tool represented the construct of 'good research technique' being assessed. Face validity was subsequently established through feedback from a sample of nine health researchers. Inter-rater reliability was established through substantial agreement between three reviewers when applying the tool to a set of three research papers (κ = 71.5%), and good to substantial agreement between their scores at time 1 and after a 6-week interval at time 2 confirmed test-retest reliability.

            The QATSDD shows good reliability and validity for use in the quality assessment of a diversity of studies, and may be an extremely useful tool for reviewers to standardize and increase the rigour of their assessments in reviews of published papers which include qualitative and quantitative work. © 2011 Blackwell Publishing Ltd.

              Dashboards for improving patient care: review of the literature.

              This review aimed to provide a comprehensive overview of the current state of evidence for the use of clinical and quality dashboards in health care environments.

                Author and article information

                Journal
                Appl Clin Inform
                10.1055/s-00035026
                Applied Clinical Informatics
                Georg Thieme Verlag KG (Rüdigerstraße 14, 70469 Stuttgart, Germany)
                ISSN: 1869-0327
                06 April 2022
                March 2022
                1 April 2022
                13(2): 339-354
                Affiliations
                [1 ]Centre for Health Services Research, Faculty of Medicine, The University of Queensland, Herston, Brisbane, Australia
                [2 ]Department of Health, eHealth Queensland, Queensland Government, Brisbane, Australia
                [3 ]Information Engineering Lab, School of Information Technology and Electrical Engineering, The University of Queensland, St Lucia, Brisbane, Australia
                [4 ]Digital Health Cooperative Research Centre, Australian Government, Sydney, New South Wales, Australia
                [5 ]UQ Business School, Faculty of Business, Economics and Law, The University of Queensland, St. Lucia, Brisbane, Australia
                [6 ]School of Pharmacy, Faculty of Health and Behavioural Sciences, The University of Queensland, PACE Precinct, Woolloongabba, Brisbane, Australia
                [7 ]Pharmacy Department, Princess Alexandra Hospital, Woolloongabba, Brisbane, Australia
                [8 ]School of Mathematics and Physics, Faculty of Science, The University of Queensland, St Lucia, Brisbane, Australia
                [9 ]Department of Health, Metro North Hospital and Health Service, Queensland Government, Herston QLD, Australia
                Author notes
                Address for correspondence: Jodie A. Austin, BPharm, PGDipClinPharm, Centre for Health Services Research, Faculty of Medicine, The University of Queensland, Herston, Brisbane 4006, Australia ( j.austin1@uq.edu.au )
                Article
                210202r
                10.1055/s-0042-1743243
                8986462
                35388447
                d4575e49-9401-4bd0-9baa-48986319be78
                The Author(s). This is an open access article published by Thieme under the terms of the Creative Commons Attribution-NonCommercial-NoDerivatives License, permitting copying and reproduction so long as the original work is given appropriate credit. Contents may not be used for commercial purposes, or adapted, remixed, transformed or built upon. ( https://creativecommons.org/licenses/by-nc-nd/4.0/ )


                History
                Received: 29 July 2021
                Accepted: 09 January 2022
                Funding
                Funded by: Digital Health CRC
                Award ID: STARS 0034
                Categories
                Review Article

                learning health care system, electronic health records and systems, clinical decision support, hospital information systems, clinical data management, dashboard, digital hospital
