
Is Open Access

      Exploring the Use of Evidence From the Development and Evaluation of an Electronic Health (eHealth) Trial: Case Study

research-article
Monika Jurkeviciute, MSc 1; Henrik Eriksson, MSc, PhD 1
Journal of Medical Internet Research
JMIR Publications
evidence-based practice, evidence use, eHealth, evaluation, evaluation use


          Abstract

          Background

          Evidence-based practice refers to basing clinical decisions on credible research evidence, professional experience, and patient preferences. However, there is growing concern that evidence in the context of electronic health (eHealth) is not sufficiently used when forming health care policy and practice. The discourse in this area is dominated by the use of evaluation and research evidence in clinical or policy decisions, whereas the use of other types of evidence, such as professional experience, is underexplored. Moreover, evidence may be used in ways other than clinical or policy decisions.

          Objective

          This study aimed to analyze how different types of evidence (such as evaluation outcomes [including patient preferences], professional experiences, and existing scientific evidence from other research) obtained within the development and evaluation of an eHealth trial are used by diverse stakeholders. An additional aim was to identify barriers to the use of evidence and ways to support its use.

          Methods

          This study was built on a case of an eHealth trial funded by the European Union. The project included 4 care centers, 2 research and development companies that provided the web-based physical exercise program and an activity monitoring device, and 2 science institutions. The qualitative data collection included 9 semistructured interviews conducted 8 months after the evaluation was concluded. The data analysis concerned (1) activities and decisions that were made based on evidence after the project ended, (2) evidence used for those activities and decisions, (3) in what way the evidence was used, and (4) barriers to the use of evidence.

          Results

          Evidence generated from eHealth trials can be used by various stakeholders for decisions regarding the clinical integration of eHealth solutions, policy making, scientific publishing, research funding applications, eHealth technology, and teaching. Evaluation evidence carries less weight than professional experience in local decision making regarding the integration of eHealth into clinical practice. Professional experience constitutes the evidence valuable to the widest variety of activities and decisions related to eHealth trials. When using existing scientific evidence related to eHealth trials, it is important to consider contextual relevance, such as location or disease. To support the use of evidence, we suggest creating opportunities for health care professionals to gain experience, assessing a few rather than many variables, and designing shorter, iterative cycles of evaluation.

          Conclusions

          Initiatives to support and standardize evidence-based practice in the context of eHealth should account for the complexities in how evidence is used in order to achieve better uptake of evidence in practice. One should also be aware that the assumption of purely fact-based decision making in organizations is misleading. To improve the chances that the evidence produced will be used, these complexities should be addressed in the design of eHealth trials.


                Author and article information

                Journal: Journal of Medical Internet Research (J Med Internet Res; JMIR)
                Publisher: JMIR Publications (Toronto, Canada)
                ISSN: 1439-4456; eISSN: 1438-8871
                Published: 28 August 2020; Volume 22, Issue 8: e17718
                Affiliations: [1] Centre for Healthcare Improvement, Chalmers University of Technology, Gothenburg, Sweden
                Corresponding Author: Monika Jurkeviciute, monika.jurkeviciute@chalmers.se
                ORCID: https://orcid.org/0000-0001-9824-3649; https://orcid.org/0000-0001-6464-7231
                Article: v22i8e17718
                DOI: 10.2196/17718
                PMCID: PMC7486667
                PMID: 32857057
                ©Monika Jurkeviciute, Henrik Eriksson. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 28.08.2020.

                This is an open-access article distributed under the terms of the Creative Commons Attribution License ( https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the Journal of Medical Internet Research, is properly cited. The complete bibliographic information, a link to the original publication on http://www.jmir.org/, as well as this copyright and license information must be included.

                History: 7 January 2020; 17 March 2020; 21 May 2020; 13 June 2020
                Categories
                Original Paper

                Medicine
                evidence-based practice, evidence use, eHealth, evaluation, evaluation use
