      Is Open Access

      The path forward for large language models in medicine is open

Brief report
      NPJ Digital Medicine
      Nature Publishing Group UK
      Health care, Health policy


          Abstract

          Large language models (LLMs) are increasingly applied in medical documentation and have been proposed for clinical decision support. We argue that the future for LLMs in medicine must be based on transparent and controllable open-source models. Openness enables medical tool developers to control the safety and quality of underlying AI models, while also allowing healthcare professionals to hold these models accountable. For these reasons, the future is open.


Most cited references (6)


          Large language models in medicine


            Large language model AI chatbots require approval as medical devices


              A future role for health applications of large language models depends on regulators enforcing safety standards.

Amid the rapid integration of artificial intelligence into clinical settings, large language models (LLMs), such as Generative Pre-trained Transformer-4, have emerged as multifaceted tools with potential for health-care delivery, diagnosis, and patient care. However, the deployment of LLMs raises substantial regulatory and safety concerns. Owing to their high output variability, poor inherent explainability, and the risk of so-called AI hallucinations, LLM-based health-care applications that serve a medical purpose face regulatory challenges for approval as medical devices under US and EU laws, including the recently passed EU Artificial Intelligence Act. Despite unaddressed risks to patients, including misdiagnosis and unverified medical advice, such applications are already on the market. The regulatory ambiguity surrounding these tools creates an urgent need for frameworks that accommodate their unique capabilities and limitations. Alongside the development of these frameworks, existing regulations should be enforced. If regulators hesitate to enforce them in a market dominated by large technology companies, the resulting harm to laypersons will force belated action and damage the potential of LLM-based applications for layperson medical advice.

                Author and article information

                Contributors
                lars.riedemann@med.uni-heidelberg.de
Journal
NPJ Digital Medicine (NPJ Digit Med)
                Nature Publishing Group UK (London )
ISSN: 2398-6352
Published: 27 November 2024
Volume 7, Article 339
                Affiliations
[1] Department of Neurology, Heidelberg University Hospital, Im Neuenheimer Feld 400, 69120 Heidelberg, Germany
[2] Liquid AI, Inc., 314 Main St., Cambridge, MA 02142, USA
[3] Else Kröner Fresenius Center for Digital Health, TUD Dresden University of Technology, Fetscherstr. 74, 01307 Dresden, Germany
Article
DOI: 10.1038/s41746-024-01344-w
PMCID: PMC11603019
PMID: 39604549
                © The Author(s) 2024

                Open Access This article is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License, which permits any non-commercial use, sharing, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if you modified the licensed material. You do not have permission under this licence to share adapted material derived from this article or parts of it. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by-nc-nd/4.0/.

History
Received: 8 October 2024
Accepted: 14 November 2024
                Funding
Funded by: Bundesministerium für Bildung und Forschung (FundRef 10.13039/501100002347)
Award ID: 16KISA100K
                Categories
                News & Views
                © Springer Nature Limited 2024

Keywords: health care, health policy
