
      ‘It depends’: what 86 systematic reviews tell us about what strategies to use to support the use of research in clinical practice

      review-article


          Abstract

          Background

The gap between research findings and clinical practice is well documented, and a range of strategies has been developed to support the implementation of research into clinical practice. The objective of this study was to update and extend two previous reviews of systematic reviews of strategies designed to implement research evidence into clinical practice.

          Methods

We developed a comprehensive systematic literature search strategy based on the terms used in the previous reviews to identify studies that looked explicitly at interventions designed to turn research evidence into practice. The search was performed in June 2022 in four electronic databases: Medline, Embase, Cochrane and Epistemonikos. We searched from January 2010 up to June 2022 and applied no language restrictions. Two independent reviewers appraised the quality of included studies using a quality assessment checklist. To reduce the risk of bias, papers were excluded following discussion among all members of the team. Data were synthesised using descriptive and narrative techniques to identify themes and patterns linked to intervention strategies, targeted behaviours, study settings and study outcomes.
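The search-and-screening workflow described above (four database exports, a fixed January 2010 to June 2022 window, de-duplication before appraisal by two independent reviewers) is not accompanied by published code, so the following is only a minimal illustrative sketch of how such pooled exports might be date-limited and de-duplicated before screening; the file names and column layout (title, doi, pub_date) are assumptions for illustration, not the authors' actual pipeline.

```python
# Hypothetical sketch only: pool database exports, restrict to the review's
# search window, and de-duplicate before screening. Field names are assumed.
import csv
from datetime import date

EXPORTS = ["medline.csv", "embase.csv", "cochrane.csv", "epistemonikos.csv"]  # assumed file names
WINDOW = (date(2010, 1, 1), date(2022, 6, 30))  # January 2010 to June 2022, per the Methods


def normalise(title: str) -> str:
    """Lower-case and strip punctuation so near-identical titles collide on the same key."""
    return "".join(ch for ch in title.lower() if ch.isalnum() or ch.isspace()).strip()


def load_unique_records() -> list[dict]:
    seen, unique = set(), []
    for path in EXPORTS:
        with open(path, newline="", encoding="utf-8") as fh:
            # Assumes each export has columns: title, doi, pub_date (YYYY-MM-DD).
            for row in csv.DictReader(fh):
                published = date.fromisoformat(row["pub_date"])
                if not (WINDOW[0] <= published <= WINDOW[1]):
                    continue  # outside the review's search window
                key = row.get("doi") or normalise(row["title"])
                if key in seen:
                    continue  # same record retrieved from another database
                seen.add(key)
                unique.append(row)
    return unique


if __name__ == "__main__":
    records = load_unique_records()
    print(f"{len(records)} unique records ready for independent appraisal by two reviewers")
```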

          Results

We identified 32 reviews conducted between 2010 and 2022. The reviews are mainly of multi-faceted interventions (n = 20), although there are reviews focusing on single strategies (ICT, educational, reminders, local opinion leaders, audit and feedback, social media and toolkits). The majority of reviews report strategies achieving small impacts (normally on processes of care). There is much less evidence that these strategies have shifted patient outcomes. Furthermore, considerable nuance lies behind these headline findings, and this is increasingly commented upon in the reviews themselves.

          Discussion

Combined with the two previous reviews, 86 systematic reviews of strategies to increase the implementation of research into clinical practice have been identified. We need to shift the emphasis away from isolating individual and multi-faceted interventions to better understanding and building more situated, relational and organisational capability to support the use of research in clinical practice. This will involve drawing on a wider range of research perspectives (including social science) in primary studies and diversifying the types of synthesis undertaken to include approaches such as realist synthesis, which facilitate exploration of the context in which strategies are employed.

          Supplementary Information

          The online version contains supplementary material available at 10.1186/s13012-024-01337-z.


Most cited references (54)


          A refined compilation of implementation strategies: results from the Expert Recommendations for Implementing Change (ERIC) project

Background

Identifying, developing, and testing implementation strategies are important goals of implementation science. However, these efforts have been complicated by the use of inconsistent language and inadequate descriptions of implementation strategies in the literature. The Expert Recommendations for Implementing Change (ERIC) study aimed to refine a published compilation of implementation strategy terms and definitions by systematically gathering input from a wide range of stakeholders with expertise in implementation science and clinical practice.

Methods

Purposive sampling was used to recruit a panel of experts in implementation and clinical practice who engaged in three rounds of a modified Delphi process to generate consensus on implementation strategies and definitions. The first and second rounds involved Web-based surveys soliciting comments on implementation strategy terms and definitions. After each round, iterative refinements were made based upon participant feedback. The third round involved a live polling and consensus process via a Web-based platform and conference call.

Results

Participants identified substantial concerns with 31% of the terms and/or definitions and suggested five additional strategies. Seventy-five percent of definitions from the originally published compilation of strategies were retained after voting. Ultimately, the expert panel reached consensus on a final compilation of 73 implementation strategies.

Conclusions

This research advances the field by improving the conceptual clarity, relevance, and comprehensiveness of implementation strategies that can be used in isolation or combination in implementation research and practice. Future phases of ERIC will focus on developing conceptually distinct categories of strategies as well as ratings for each strategy’s importance and feasibility. Next, the expert panel will recommend multifaceted strategies for hypothetical yet real-world scenarios that vary by sites’ endorsement of evidence-based programs and practices and the strength of contextual supports that surround the effort.

Electronic supplementary material

The online version of this article (doi:10.1186/s13012-015-0209-1) contains supplementary material, which is available to authorized users.

Realist review – a new method of systematic review designed for complex policy interventions.

Evidence-based policy is a dominant theme in contemporary public services but the practical realities and challenges involved in using evidence in policy-making are formidable. Part of the problem is one of complexity. In health services and other public services, we are dealing with complex social interventions which act on complex social systems: things like league tables, performance measures, regulation and inspection, or funding reforms. These are not 'magic bullets' which will always hit their target, but programmes whose effects are crucially dependent on context and implementation. Traditional methods of review focus on measuring and reporting on programme effectiveness, often find that the evidence is mixed or conflicting, and provide little or no clue as to why the intervention worked or did not work when applied in different contexts or circumstances, deployed by different stakeholders, or used for different purposes. This paper offers a model of research synthesis which is designed to work with complex social interventions or programmes, and which is based on the emerging 'realist' approach to evaluation. It provides an explanatory analysis aimed at discerning what works for whom, in what circumstances, in what respects and how. The first step is to make explicit the programme theory (or theories): the underlying assumptions about how an intervention is meant to work and what impacts it is expected to have. We then look for empirical evidence to populate this theoretical framework, supporting, contradicting or modifying the programme theories as we go. The results of the review combine theoretical understanding and empirical evidence, and focus on explaining the relationship between the context in which the intervention is applied, the mechanisms by which it works and the outcomes which are produced. The aim is to enable decision-makers to reach a deeper understanding of the intervention and how it can be made to work most effectively. Realist review does not provide simple answers to complex questions. It will not tell policy-makers or managers whether something works or not, but will provide the policy and practice community with the kind of rich, detailed and highly practical understanding of complex social interventions which is likely to be of much more use to them when planning and implementing programmes at a national, regional or local level.

              Welcome to Implementation Science

              Implementation research is the scientific study of methods to promote the systematic uptake of research findings and other evidence-based practices into routine practice, and, hence, to improve the quality and effectiveness of health services and care. This relatively new field includes the study of influences on healthcare professional and organisational behaviour. Implementation Science will encompass all aspects of research in this field, in clinical, community and policy contexts. This online journal will provide a unique platform for this type of research and will publish a broad range of articles – study protocols, debate, theoretical and conceptual articles, rigorous evaluations of the process of change, and articles on methodology and rigorously developed tools – that will enhance the development and refinement of implementation research. No one discipline, research design, or paradigm will be favoured. Implementation Science looks forward to receiving manuscripts that facilitate the continued development of the field, and contribute to healthcare policy and practice.

                Author and article information

                Contributors
                Annette.Boaz@kcl.ac.uk
                Journal
Implement Sci
Implementation Science: IS
BioMed Central (London)
ISSN: 1748-5908
Published: 19 February 2024
Volume 19, Article number 15
                Affiliations
[1] Health and Social Care Workforce Research Unit, The Policy Institute, King’s College London (https://ror.org/0220mzb33), Virginia Woolf Building, 22 Kingsway, London, WC2B 6LE, UK
[2] King’s Business School, King’s College London (https://ror.org/0220mzb33), 30 Aldwych, London, WC2B 4BG, UK
[3] Federal University of Santa Catarina (UFSC), Campus Universitário Reitor João Davi Ferreira Lima (https://ror.org/041akq887), Florianópolis, SC 88.040-900, Brazil
                Author information
                http://orcid.org/0000-0003-0557-1294
                http://orcid.org/0000-0003-1121-1551
Article
DOI: 10.1186/s13012-024-01337-z
PMCID: PMC10875780
PMID: 38374051
                © The Author(s) 2024

                Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

History
Received: 1 November 2023
Accepted: 5 January 2024
                Funding
Funded by: National Institute for Health and Care Research (FundRef: http://dx.doi.org/10.13039/501100000272)
                Award ID: NIHR200152
                Categories
                Systematic Review
                Custom metadata
                © BioMed Central Ltd., part of Springer Nature 2024

                Medicine
systematic review, implementation, strategies, interventions, clinical practice, research evidence, single, multi-faceted
