

      Enlight: A Comprehensive Quality and Therapeutic Potential Evaluation Tool for Mobile and Web-Based eHealth Interventions


          Abstract

          Background

          Studies of criteria-based assessment tools have demonstrated the feasibility of objectively evaluating eHealth interventions independent of empirical testing. However, current tools have not included some quality constructs associated with intervention outcome, such as persuasive design, behavior change, or therapeutic alliance. In addition, the generalizability of such tools has not been explicitly examined.

          Objective

This study aims to present the development and further analysis of the Enlight suite of measures, which was designed to incorporate the aforementioned quality constructs and to address generalizability.

          Methods

          As a first step, a comprehensive systematic review was performed to identify relevant quality rating criteria in line with the PRISMA statement. These criteria were then categorized to create Enlight. The second step involved testing Enlight on 42 mobile apps and 42 Web-based programs (delivery mediums) targeting modifiable behaviors related to medical illness or mental health (clinical aims).

          Results

          A total of 476 criteria from 99 identified sources were used to build Enlight. The rating measures were divided into two sections: quality assessments and checklists. Quality assessments included usability, visual design, user engagement, content, therapeutic persuasiveness, therapeutic alliance, and general subjective evaluation. The checklists included credibility, privacy explanation, basic security, and evidence-based program ranking. The quality constructs exhibited excellent interrater reliability (intraclass correlations=.77-.98, median .91) and internal consistency (Cronbach alphas=.83-.90, median .88), with similar results when separated into delivery mediums or clinical aims. Conditional probability analysis revealed that 100% of the programs that received a score of fair or above (≥3.0) in therapeutic persuasiveness or therapeutic alliance received the same range of scores in user engagement and content—a pattern that did not appear in the opposite direction. Preliminary concurrent validity analysis pointed to positive correlations of combined quality scores with selected variables. The combined score that did not include therapeutic persuasiveness and therapeutic alliance descriptively underperformed the other combined scores.
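To make these statistics concrete, below is a minimal Python sketch of how interrater reliability, internal consistency, and the conditional probability analysis can be computed from rating matrices. This is an illustration under stated assumptions, not the authors' code: the function names and the simulated 1-5 scores are hypothetical, Cronbach's alpha follows its standard formula, and ICC(2,1) (two-way random effects, absolute agreement, single rater) is one common intraclass correlation variant; the abstract does not specify which ICC form the authors used.

    import numpy as np

    def cronbach_alpha(item_scores):
        """Cronbach's alpha for an (n_programs, n_items) matrix of item scores."""
        x = np.asarray(item_scores, dtype=float)
        k = x.shape[1]
        item_vars = x.var(axis=0, ddof=1).sum()   # sum of per-item variances
        total_var = x.sum(axis=1).var(ddof=1)     # variance of the total score
        return k / (k - 1) * (1 - item_vars / total_var)

    def icc_2_1(ratings):
        """ICC(2,1): two-way random effects, absolute agreement, single rater.
        `ratings` is an (n_programs, k_raters) matrix."""
        x = np.asarray(ratings, dtype=float)
        n, k = x.shape
        grand = x.mean()
        ss_rows = k * ((x.mean(axis=1) - grand) ** 2).sum()  # between programs
        ss_cols = n * ((x.mean(axis=0) - grand) ** 2).sum()  # between raters
        ss_err = ((x - grand) ** 2).sum() - ss_rows - ss_cols
        msr = ss_rows / (n - 1)
        msc = ss_cols / (k - 1)
        mse = ss_err / ((n - 1) * (k - 1))
        return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

    def conditional_pass_rate(scores_a, scores_b, threshold=3.0):
        """P(B >= threshold | A >= threshold): of the programs rated fair or
        above on construct A, the share that also reach that range on B."""
        a = np.asarray(scores_a) >= threshold
        b = np.asarray(scores_b) >= threshold
        return float(b[a].mean()) if a.any() else float("nan")

    # Illustrative use with simulated 1-5 ratings for 84 programs.
    rng = np.random.default_rng(0)
    persuasiveness = rng.uniform(1, 5, 84)
    engagement = np.clip(persuasiveness + rng.normal(0, 0.5, 84), 1, 5)
    print(conditional_pass_rate(persuasiveness, engagement))

Running conditional_pass_rate in both directions (A given B, then B given A) would expose the kind of asymmetry reported above, where a fair-or-above therapeutic persuasiveness score implies fair-or-above user engagement and content scores but not the reverse.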

          Conclusions

          This paper provides empirical evidence supporting the importance of persuasive design and therapeutic alliance within the context of a program’s evaluation. Reliability metrics and preliminary concurrent validity analysis indicate the potential of Enlight in examining eHealth programs regardless of delivery mediums and clinical aims.


                Author and article information

Journal: Journal of Medical Internet Research (J Med Internet Res; JMIR)
Publisher: JMIR Publications (Toronto, Canada)
ISSN: 1439-4456, 1438-8871
Published: 21 March 2017; 19(3):e82
Affiliations
1. Psychiatry Research, The Feinstein Institute for Medical Research, Glen Oaks, NY, United States
2. Northwell Hofstra School of Medicine, Hempstead, NY, United States
Author notes
Corresponding Author: Amit Baumel, abaumel@northwell.edu
                Author information
                http://orcid.org/0000-0002-7043-8898
                http://orcid.org/0000-0002-3647-5505
                http://orcid.org/0000-0002-6573-844X
                http://orcid.org/0000-0002-2628-9442
                http://orcid.org/0000-0002-2483-9594
Article
DOI: 10.2196/jmir.7270
PMCID: PMC5380814
PMID: 28325712
                ©Amit Baumel, Keren Faber, Nandita Mathur, John M Kane, Fred Muench. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 21.03.2017.

This is an open-access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the Journal of Medical Internet Research, is properly cited. The complete bibliographic information, a link to the original publication on http://www.jmir.org/, as well as this copyright and license information must be included.

History: 9 January 2017; 2 February 2017; 21 February 2017; 22 February 2017
Categories: Original Paper

Subject: Medicine
Keywords: eHealth, mHealth, assessment, evaluation, quality, persuasive design, behavior change, therapeutic alliance
