
      Harmonizing evidence-based practice, implementation context, and implementation strategies with user-centered design: a case example in young adult cancer care

research-article

          Abstract

          Background

          Attempting to implement evidence-based practices in contexts for which they are not well suited may compromise their fidelity and effectiveness or burden users (e.g., patients, providers, healthcare organizations) with elaborate strategies intended to force implementation. To improve the fit between evidence-based practices and contexts, implementation science experts have called for methods for adapting evidence-based practices and contexts and tailoring implementation strategies; yet, methods for considering the dynamic interplay among evidence-based practices, contexts, and implementation strategies remain lacking. We argue that harmonizing the three can be facilitated by user-centered design, an iterative and highly stakeholder-engaged set of principles and methods.

          Methods

          This paper presents a case example in which we used a three-phase user-centered design process to design and plan to implement a care coordination intervention for young adults with cancer. Specifically, we used usability testing to redesign and augment an existing patient-reported outcome measure that served as the basis for our intervention to optimize its usability and usefulness, ethnographic contextual inquiry to prepare the context (i.e., a comprehensive cancer center) to promote receptivity to implementation, and iterative prototyping workshops with a multidisciplinary design team to design the care coordination intervention and anticipate implementation strategies needed to enhance contextual fit.

          Results

          Our user-centered design process resulted in the Young Adult Needs Assessment and Service Bridge (NA-SB), including a patient-reported outcome measure and a collection of referral pathways that are triggered by the needs young adults report, as well as implementation guidance. By ensuring NA-SB directly responded to features of users and context, we designed NA-SB for implementation, potentially minimizing the strategies needed to address misalignment that may have otherwise existed. Furthermore, we designed NA-SB for scale-up; by engaging users from other cancer programs across the country to identify points of contextual variation which would require flexibility in delivery, we created a tool intended to accommodate diverse contexts.

          Conclusions

          User-centered design can help maximize usability and usefulness when designing evidence-based practices, preparing contexts, and informing implementation strategies—in effect, harmonizing evidence-based practices, contexts, and implementation strategies to promote implementation and effectiveness.

          Supplementary Information

          The online version contains supplementary material available at 10.1186/s43058-021-00147-4.

                Author and article information

                Contributors
                ehaines@wakehealth.edu
                adopp@rand.org
                lyona@uw.edu
                Holly.witteman@fmed.ulaval.ca
                miriamb@uci.edu
                gratianne.vaisson.1@ulaval.ca
                dani.hitch@deakin.edu.au
                sbirken@wakehealth.edu
Journal
Implement Sci Commun (Implementation Science Communications)
BioMed Central (London)
ISSN: 2662-2211
Published online: 26 April 2021
Volume: 2
Article number: 45
Affiliations
[1] Department of Social Sciences and Health Policy, Wake Forest School of Medicine, 525 Vine Street, Winston-Salem, NC 27101, USA
[2] Department of Behavioral and Policy Sciences, RAND Corporation, 1776 Main St, Santa Monica, CA 90401, USA
[3] Psychiatry and Behavioral Sciences, University of Washington, 6200 NE 74th Street, Suite 100, Seattle, WA 98115, USA
[4] Department of Family and Emergency Medicine, Faculty of Medicine, Laval University, Ferdinand Vandry Pavillon, 1050 Avenue de la Médecine, Quebec City, QC G1V 0A6, Canada
[5] Sue & Bill Gross School of Nursing, University of California, Irvine, 252C Berk Hall, Irvine, CA 92697-3959, USA
[6] Occupational Therapy, Faculty of Medicine, Laval University, Ferdinand Vandry Pavillon, 1050 Avenue de la Médecine, Quebec City, QC G1V 0A6, Canada
[7] Department of Physical Activity and Nutrition Research, School of Health and Social Development, Deakin University, Waterfront Campus, 1 Gheringhap Street, Geelong, VIC 3220, Australia
[8] Department of Implementation Science, Wake Forest School of Medicine, 525@Vine Room 5219, Medical Center Boulevard, Winston-Salem, NC 27157, USA
Author information
ORCID: http://orcid.org/0000-0001-7208-5937
Article
DOI: 10.1186/s43058-021-00147-4
PMCID: PMC8077816
PMID: 33902748
                © The Author(s) 2021

Open Access: This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

History
Received: 7 August 2020
Accepted: 11 April 2021
Funding
Funded by: National Cancer Institute (FundRef http://dx.doi.org/10.13039/100000054); Award ID: 2T32 CA122061
Funded by: National Institute of Mental Health (FundRef http://dx.doi.org/10.13039/100000025); Award ID: 5R25MH08091607
Funded by: National Center for Advancing Translational Sciences (FundRef http://dx.doi.org/10.13039/100006108); Award ID: KL2TR002490
Funded by: Canada Excellence Research Chairs, Government of Canada (FundRef http://dx.doi.org/10.13039/501100002784); Award ID: Human-Centred Digital Health
Funded by: Lineberger Comprehensive Cancer Center, University of North Carolina at Chapel Hill (FundRef http://dx.doi.org/10.13039/100008615); Award ID: University Cancer Research Fund
Funded by: Quality Enhancement Research Initiative (FundRef http://dx.doi.org/10.13039/100007181)
                Categories
                Methodology

Keywords: user-centered design, human-centered design, context, evidence-based practice implementation, designing implementation strategies, contextual appropriateness, EBP redesign, stakeholder engagement, adaptation
