
      Unhappy Patients Are Not Alike: Content Analysis of the Negative Comments from China's Good Doctor Website

      research-article


          Abstract

          Background

          With the rise in popularity of Web 2.0 technologies, sharing patient experiences about physicians on online forums and medical websites has become common practice. However, negative comments posted by patients are considered more influential, by both other patients and physicians, than satisfactory ones.

          Objective

          The aim of this study was to analyze negative comments posted online about physicians and to identify possible solutions for improving patient satisfaction and patients' relationships with physicians.

          Methods

          A Java-based program was developed to collect patient comments from the Good Doctor website, one of the most popular online health communities in China. A total of 3012 negative comments concerning 1029 physicians (mean 2.93 comments per physician [SD 4.14]) from 5 highly ranked hospitals in Beijing were extracted for content analysis. An initial coding framework was constructed, with 2 research assistants involved in the coding.
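          The abstract names only "a Java-based program" for collecting comments; the crawler itself is not published. The following is a minimal illustrative sketch of that kind of collector, assuming the jsoup HTML parser and a hypothetical CSS selector and URL pattern for comment pages on the Good Doctor website (the real page structure will differ):

```java
import org.jsoup.Jsoup;
import org.jsoup.nodes.Document;
import org.jsoup.nodes.Element;

import java.util.ArrayList;
import java.util.List;

/**
 * Illustrative crawler sketch only: the URL pattern and CSS selector below
 * are hypothetical placeholders, not the authors' actual implementation.
 */
public class CommentCrawler {

    public static List<String> fetchComments(String physicianPageUrl) throws Exception {
        List<String> comments = new ArrayList<>();
        // Fetch one physician's review page with a modest timeout.
        Document doc = Jsoup.connect(physicianPageUrl)
                .userAgent("Mozilla/5.0 (research crawler)")
                .timeout(10_000)
                .get();
        // "div.comment-text" is an assumed selector for patient comment blocks.
        for (Element e : doc.select("div.comment-text")) {
            comments.add(e.text().trim());
        }
        return comments;
    }

    public static void main(String[] args) throws Exception {
        // Hypothetical example URL; the study crawled physician pages on haodf.com.
        List<String> comments = fetchComments("https://www.haodf.com/doctor/EXAMPLE/pingjia.html");
        comments.forEach(System.out::println);
    }
}
```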

          Results

          Analysis of the 3012 collected negative comments revealed that unhappy patients are not alike and that their complaints cover a wide range of issues experienced throughout the whole process of medical consultation. Physicians in Obstetrics and Gynecology (606/3012, 20.12%; P=.001) and Internal Medicine (487/3012, 16.17%; P=.80) received the most negative comments. For negative comments per physician, Dermatology and Sexually Transmitted Diseases (mean 5.72, P<.001) and Andrology (mean 5, P=.02) ranked the highest. Complaints relating to insufficient medical consultation duration (577/3012, 19.16%), physician impatience (527/3012, 17.50%), and perceived poor therapeutic effect (370/3012, 12.28%) were the most frequent. Specific groups, such as those accompanying older patients or children, traveling patients, and very important person (VIP) registrants, showed little tolerance for poor medical service.
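          The proportions quoted above follow directly from the raw counts. The sketch below simply recomputes them and adds a purely illustrative chi-square comparison over the three most frequent complaint categories; the abstract does not state which statistical tests produced its P values, and Apache Commons Math is assumed here:

```java
import org.apache.commons.math3.stat.inference.ChiSquareTest;

public class ComplaintShares {
    public static void main(String[] args) {
        int total = 3012;
        // Counts reported in the abstract for the top complaint categories.
        String[] labels = {"Short consultation", "Physician impatience", "Poor therapeutic effect"};
        int[] counts = {577, 527, 370};
        for (int i = 0; i < counts.length; i++) {
            // Reproduces 19.16%, 17.50%, and 12.28% respectively.
            System.out.printf("%s: %d/%d = %.2f%%%n", labels[i], counts[i], total, 100.0 * counts[i] / total);
        }

        // Illustrative only: goodness-of-fit against a uniform split across these
        // three categories; this is not the test reported in the paper.
        double expectedEach = (577 + 527 + 370) / 3.0;
        double p = new ChiSquareTest().chiSquareTest(
                new double[]{expectedEach, expectedEach, expectedEach},
                new long[]{577, 527, 370});
        System.out.printf("Uniformity check P value: %.4f%n", p);
    }
}
```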

          Conclusions

          Analysis of online patient complaints provides an innovative approach to understanding the factors associated with patient dissatisfaction. The outcomes of this study could benefit hospitals and physicians seeking to improve their delivery of patient-centered services. Patients, in turn, are expected to be more understanding of physicians' heavy workloads, which stem from China's stretched medical resources, as efforts are made to build more harmonious physician-patient relationships.

          Most cited references (29)


          Use of Sentiment Analysis for Capturing Patient Experience From Free-Text Comments Posted Online

          Background: There are large amounts of unstructured, free-text information about quality of health care available on the Internet in blogs, social networks, and on physician rating websites that are not captured in a systematic way. New analytical techniques, such as sentiment analysis, may allow us to understand and use this information more effectively to improve the quality of health care.

          Objective: We attempted to use machine learning to understand patients’ unstructured comments about their care. We used sentiment analysis techniques to categorize online free-text comments by patients as either positive or negative descriptions of their health care. We tried to automatically predict whether a patient would recommend a hospital, whether the hospital was clean, and whether they were treated with dignity from their free-text description, compared to the patient’s own quantitative rating of their care.

          Methods: We applied machine learning techniques to all 6412 online comments about hospitals on the English National Health Service website in 2010 using Weka data-mining software. We also compared the results obtained from sentiment analysis with the paper-based national inpatient survey results at the hospital level using Spearman rank correlation for all 161 acute adult hospital trusts in England.

          Results: There was 81%, 84%, and 89% agreement between quantitative ratings of care and those derived from free-text comments using sentiment analysis for cleanliness, being treated with dignity, and overall recommendation of hospital respectively (kappa scores: .40–.74, P<.001 for all). We observed mild to moderate associations between our machine learning predictions and responses to the large patient survey for the three categories examined (Spearman rho 0.37-0.51, P<.001 for all).

          Conclusions: The prediction accuracy that we have achieved using this machine learning process suggests that we are able to predict, from free-text, a reasonably accurate assessment of patients’ opinion about different performance aspects of a hospital and that these machine learning predictions are associated with results of more conventional surveys.
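          As a rough illustration of the Weka-based pipeline this reference describes (not the authors' actual code), the sketch below trains and cross-validates a bag-of-words sentiment classifier on a hypothetical ARFF file containing one string attribute for the comment text and a positive/negative class label:

```java
import java.util.Random;

import weka.classifiers.Evaluation;
import weka.classifiers.bayes.NaiveBayesMultinomial;
import weka.classifiers.meta.FilteredClassifier;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;
import weka.filters.unsupervised.attribute.StringToWordVector;

public class CommentSentiment {
    public static void main(String[] args) throws Exception {
        // "comments.arff" is a placeholder dataset: one string attribute holding
        // each free-text comment plus a nominal class {positive, negative}.
        Instances data = DataSource.read("comments.arff");
        data.setClassIndex(data.numAttributes() - 1);

        // Bag-of-words conversion, applied inside each fold via FilteredClassifier.
        StringToWordVector bow = new StringToWordVector();
        bow.setLowerCaseTokens(true);
        bow.setOutputWordCounts(true);

        FilteredClassifier model = new FilteredClassifier();
        model.setFilter(bow);
        model.setClassifier(new NaiveBayesMultinomial());

        // 10-fold cross-validation, reporting the same style of metrics the
        // abstract quotes (percentage agreement and kappa).
        Evaluation eval = new Evaluation(data);
        eval.crossValidateModel(model, data, 10, new Random(1));
        System.out.printf("Agreement: %.1f%%  kappa: %.2f%n", eval.pctCorrect(), eval.kappa());
    }
}
```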

            Violence against doctors: Why China? Why now? What next?

            The Lancet (2014)

              Measuring patient-perceived quality of care in US hospitals using Twitter

              Background: Patients routinely use Twitter to share feedback about their experience receiving healthcare. Identifying and analysing the content of posts sent to hospitals may provide a novel real-time measure of quality, supplementing traditional, survey-based approaches.

              Objective: To assess the use of Twitter as a supplemental data stream for measuring patient-perceived quality of care in US hospitals and compare patient sentiments about hospitals with established quality measures.

              Design: 404 065 tweets directed to 2349 US hospitals over a 1-year period were classified as having to do with patient experience using a machine learning approach. Sentiment was calculated for these tweets using natural language processing. 11 602 tweets were manually categorised into patient experience topics. Finally, hospitals with ≥50 patient experience tweets were surveyed to understand how they use Twitter to interact with patients.

              Key results: Roughly half of the hospitals in the US have a presence on Twitter. Of the tweets directed toward these hospitals, 34 725 (9.4%) were related to patient experience and covered diverse topics. Analyses limited to hospitals with ≥50 patient experience tweets revealed that they were more active on Twitter, more likely to be below the national median of Medicare patients (p<0.001) and above the national median for nurse/patient ratio (p=0.006), and to be a nonprofit hospital (p<0.001). After adjusting for hospital characteristics, we found that Twitter sentiment was not associated with Hospital Consumer Assessment of Healthcare Providers and Systems (HCAHPS) ratings (but having a Twitter account was), although there was a weak association with 30-day hospital readmission rates (p=0.003).

              Conclusions: Tweets describing patient experiences in hospitals cover a wide range of patient care aspects and can be identified using automated approaches. These tweets represent a potentially untapped indicator of quality and may be valuable to patients, researchers, policy makers and hospital administrators.
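              For the hospital-level comparison step this reference describes, the toy sketch below correlates a tweet-derived sentiment score with a survey-based quality score. The values are invented, and the study itself adjusted for hospital characteristics in its models, which this rank-correlation example does not attempt (Apache Commons Math assumed):

```java
import org.apache.commons.math3.stat.correlation.SpearmansCorrelation;

public class SentimentVsSurvey {
    public static void main(String[] args) {
        // Hypothetical per-hospital values: mean tweet sentiment and a
        // survey-based quality score. The real study covered 2349 hospitals.
        double[] tweetSentiment = {0.12, -0.05, 0.30, 0.08, -0.20, 0.15, 0.02, -0.10};
        double[] surveyScore    = {71.0, 66.5, 78.2, 69.9, 63.0, 74.1, 70.3, 65.8};

        // Rank-based (Spearman) correlation between the two hospital-level measures.
        double rho = new SpearmansCorrelation().correlation(tweetSentiment, surveyScore);
        System.out.printf("Spearman rho = %.2f%n", rho);
    }
}
```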

                Author and article information

                Contributors
                Journal
                J Med Internet Res
                J. Med. Internet Res
                JMIR
                Journal of Medical Internet Research
                JMIR Publications (Toronto, Canada)
                ISSN (print): 1439-4456
                ISSN (electronic): 1438-8871
                January 2018
                25 January 2018
                Volume: 20
                Issue: 1
                Article: e35
                Affiliations
                [1] Institute of Smart Health, School of Medicine and Health Management, Huazhong University of Science and Technology, Wuhan, China
                [2] Department of Business Information Management and Operations, University of Westminster, London, United Kingdom
                [3] School of Public Administration, Guangzhou University, Guangzhou, China
                Author notes
                Corresponding Author: Zhaohua Deng zh-deng@hust.edu.cn
                Author information
                http://orcid.org/0000-0003-0178-0750
                http://orcid.org/0000-0002-7744-7818
                http://orcid.org/0000-0003-4160-9739
                http://orcid.org/0000-0001-6367-0560
                http://orcid.org/0000-0003-2744-4579
                http://orcid.org/0000-0002-8516-309X
                Article
                v20i1e35
                DOI: 10.2196/jmir.8223
                PMC: PMC5806007
                PMID: 29371176
                bcd00ac5-b7c5-4719-a0fc-1c67c13373cb
                ©Wei Zhang, Zhaohua Deng, Ziying Hong, Richard Evans, Jingdong Ma, Hui Zhang. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 25.01.2018.

                This is an open-access article distributed under the terms of the Creative Commons Attribution License ( https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the Journal of Medical Internet Research, is properly cited. The complete bibliographic information, a link to the original publication on http://www.jmir.org/, as well as this copyright and license information must be included.

                History
                : 16 June 2017
                : 1 September 2017
                : 16 October 2017
                : 1 December 2017
                Categories
                Original Paper

                Medicine
                patient satisfaction, physician-patient relationship, Good Doctor website, patient complaint
