
      A future role for health applications of large language models depends on regulators enforcing safety standards.


          Abstract

          Amid the rapid integration of artificial intelligence into clinical settings, large language models (LLMs), such as Generative Pre-trained Transformer-4, have emerged as multifaceted tools with potential for health-care delivery, diagnosis, and patient care. However, the deployment of LLMs raises substantial regulatory and safety concerns. Due to their high output variability, poor inherent explainability, and the risk of so-called AI hallucinations, LLM-based health-care applications that serve a medical purpose face regulatory challenges for approval as medical devices under US and EU laws, including the recently passed EU Artificial Intelligence Act. Despite unaddressed risks for patients, including misdiagnosis and unverified medical advice, such applications are available on the market. The regulatory ambiguity surrounding these tools creates an urgent need for frameworks that accommodate their unique capabilities and limitations. Alongside the development of these frameworks, existing regulations should be enforced. If regulators hesitate to enforce the regulations in a market dominated by large technology companies, the consequences of harm to laypeople will force belated action, damaging the potential of LLM-based applications to provide medical advice to laypeople.

          Author and article information

          Journal
          Lancet Digit Health
          The Lancet. Digital health
          Elsevier BV
          ISSN: 2589-7500
          Sep 2024
          Volume: 6
          Issue: 9
          Affiliations
          [1] Else Kröner Fresenius Center for Digital Health, TUD Dresden University of Technology, Dresden, Germany.
          [2] Else Kröner Fresenius Center for Digital Health, TUD Dresden University of Technology, Dresden, Germany; Department of Medicine, Medical Faculty Mannheim, Heidelberg University, Mannheim, Germany.
          [3] Else Kröner Fresenius Center for Digital Health, TUD Dresden University of Technology, Dresden, Germany; Department of Medicine, University Hospital Dresden, Dresden, Germany; Medical Oncology, National Center for Tumor Diseases, University Hospital Heidelberg, Heidelberg, Germany.
          [4] Else Kröner Fresenius Center for Digital Health, TUD Dresden University of Technology, Dresden, Germany. Electronic address: stephen.gilbert@tu-dresden.de.
          Article
          PII: S2589-7500(24)00124-9
          DOI: 10.1016/S2589-7500(24)00124-9
          PMID: 39179311
          75e802db-4986-4981-b7ea-8fa95ab54ad5