

      Smartphone and Mobile Health Apps for Tinnitus: Systematic Identification, Analysis, and Assessment

      research-article


          Abstract

          Background

          Modern smartphones contain sophisticated high-end hardware, offer high computational capabilities at manageable cost, and have undoubtedly become an integral part of users' daily lives. In addition, smartphones offer a well-established ecosystem that is easily discoverable and accessible via the marketplaces of the different mobile platforms, which encourages the development of many smartphone apps. Such apps are not used exclusively for entertainment but are also commonplace in health care and medicine. A variety of such health and medical apps exist in the context of tinnitus, the perception of a phantom sound in the absence of any external physical source.

          Objective

          In this paper, we provide an up-to-date overview of existing smartphone apps that address tinnitus.

          Methods

          Following the PRISMA guidelines, we systematically searched for and identified existing smartphone apps on the two most prominent app markets, the Google Play Store and the Apple App Store. In addition, we applied the Mobile App Rating Scale (MARS) to assess the apps in terms of their general quality and in-depth user experience.

          Results

          Our systematic search and screening of smartphone apps yielded a total of 34 apps (34 Android apps, 26 iOS apps). The mean MARS scores (out of 5) ranged from 2.65 to 4.60. The Tinnitus Peace smartphone app had the lowest score (mean 2.65, SD 0.20), and Sanvello—Stress and Anxiety Help had the highest (mean 4.60, SD 0.10). The interrater agreement was substantial (Fleiss κ=0.74), the internal consistency was excellent (Cronbach α=.95), and the interrater reliability was high (Guttman λ6=0.94) and excellent (intraclass correlation ICC(2,k)=0.94, 95% CI 0.91-0.97).

          Conclusions

          This work demonstrated that a plethora of smartphone apps for tinnitus exists. All of the apps received MARS scores higher than 2, suggesting that they all offer some technical and functional value. However, nearly all of the identified apps lacked scientific evidence, suggesting the need for stringent clinical validation of smartphone apps in the future. To the best of our knowledge, this work is the first to systematically identify and evaluate smartphone apps within the context of tinnitus.


                Author and article information

                Contributors
                Journal
                JMIR mHealth and uHealth (JMIR Mhealth Uhealth; JMU)
                JMIR Publications (Toronto, Canada)
                ISSN: 2291-5222
                18 August 2020
                Volume 8, Issue 8: e21767
                Affiliations
                [1 ] Institute of Distributed Systems Ulm University Ulm Germany
                [2 ] Institute of Databases and Information Systems Ulm University Ulm Germany
                [3 ] Department of Psychology University of Zurich Zurich Switzerland
                [4 ] Clinic and Policlinic for Psychiatry and Psychotherapy Regensburg Germany
                [5 ] URPP Dynamics of Healthy Aging University of Zurich Zurich Switzerland
                [6 ] Institute of Clinical Epidemiology and Biometry University of Würzburg Würzburg Germany
                Author notes
                Corresponding Author: Franz J Hauck franz.hauck@uni-ulm.de
                Author information
                https://orcid.org/0000-0003-1083-6808
                https://orcid.org/0000-0001-9422-5523
                https://orcid.org/0000-0002-6006-7018
                https://orcid.org/0000-0003-3174-4910
                https://orcid.org/0000-0003-1840-0056
                https://orcid.org/0000-0003-1522-785X
                https://orcid.org/0000-0001-7942-1788
                https://orcid.org/0000-0003-2536-4153
                https://orcid.org/0000-0002-7480-9617
                Article
                v8i8e21767
                DOI: 10.2196/21767
                PMCID: 7463412
                PMID: 32808939
                ©Muntazir Mehdi, Michael Stach, Constanze Riha, Patrick Neff, Albi Dode, Rüdiger Pryss, Winfried Schlee, Manfred Reichert, Franz J Hauck. Originally published in JMIR mHealth and uHealth (http://mhealth.jmir.org), 18.08.2020.

                This is an open-access article distributed under the terms of the Creative Commons Attribution License ( https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR mHealth and uHealth, is properly cited. The complete bibliographic information, a link to the original publication on http://mhealth.jmir.org/, as well as this copyright and license information must be included.

                History
                24 June 2020
                11 July 2020
                28 July 2020
                29 July 2020
                Categories
                Original Paper

                Keywords: health care, mobile health, smartphone apps, mobile apps, tinnitus, app quality assessment and evaluation, MARS
