


      Artificial intelligence to support publishing and peer review: A summary and review


          Abstract

          Technology is being developed to support the peer review processes of journals, conferences, funders, universities, and national research evaluations. This literature and software summary discusses the partial or complete automation of several publishing‐related tasks: suggesting appropriate journals for an article, providing quality control for submitted papers, finding suitable reviewers for submitted papers or grant proposals, reviewing, and review evaluation. It also discusses attempts to estimate article quality from peer review text and scores as well as from post‐publication scores but not from bibliometric data. The literature and existing examples of working technology show that automation is useful for helping to find reviewers and there is good evidence that it can sometimes help with initial quality control of submitted manuscripts. Much other software supporting publishing and editorial work exists and is being used, but without published academic evaluations of its efficacy. The value of artificial intelligence (AI) to support reviewing has not been clearly demonstrated yet, however. Finally, whilst peer review text and scores can theoretically have value for post‐publication research assessment, it is not yet widely enough available to be a practical evidence source for systematic automation.
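
The abstract notes that automation demonstrably helps with finding reviewers. As a minimal illustrative sketch only (this is not the specific software reviewed in the article, and the reviewer names and texts below are hypothetical), reviewer finding is often framed as ranking candidates by the textual similarity between a submission and each reviewer's past publications, for example with TF-IDF vectors and cosine similarity:

    # Sketch: rank candidate reviewers by textual similarity to a submission.
    # All names and texts are hypothetical illustrations.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    # Each candidate reviewer is represented by text from their past papers.
    reviewers = {
        "Reviewer A": "peer review bias editorial decisions journals",
        "Reviewer B": "machine learning natural language processing text mining",
        "Reviewer C": "citation analysis bibliometrics research evaluation",
    }

    submission = "automated screening of submitted manuscripts with text mining"

    # Build one TF-IDF space over the submission plus all reviewer profiles.
    vectorizer = TfidfVectorizer()
    matrix = vectorizer.fit_transform([submission] + list(reviewers.values()))

    # Cosine similarity of the submission (row 0) against each reviewer profile.
    scores = cosine_similarity(matrix[0:1], matrix[1:]).flatten()
    for name, score in sorted(zip(reviewers, scores), key=lambda x: -x[1]):
        print(f"{name}: {score:.2f}")

Real reviewer-finding tools typically add conflict-of-interest checks and richer document representations; the sketch shows only the core similarity-ranking step.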

          Related collections

Most cited references (54)


          The Global Burden of Journal Peer Review in the Biomedical Literature: Strong Imbalance in the Collective Enterprise

The growth in scientific production may threaten the capacity of the scientific community to handle the ever-increasing demand for peer review of scientific publications. There is little evidence regarding the sustainability of the peer-review system and how the scientific community copes with the burden it poses. We used mathematical modeling to estimate the overall annual demand for peer review and the supply in biomedical research. The modeling was informed by empirical data from various sources in the biomedical domain, including all articles indexed in MEDLINE. We found that for 2015, across a range of scenarios, the supply exceeded the demand for reviewers and reviews by 15% to 249%. However, 20% of researchers performed 69% to 94% of the reviews. Among researchers actually contributing to peer review, 70% dedicated 1% or less of their research work-time to peer review, while 5% dedicated 13% or more of it. An estimated 63.4 million hours were devoted to peer review in 2015, of which 18.9 million hours were provided by the top 5% of contributing reviewers. Our results suggest that the system is sustainable in terms of volume but reveal a considerable imbalance in the distribution of the peer-review effort across the scientific community. Finally, various individual interactions between authors, editors, and reviewers may reduce to some extent the number of reviewers who are available to editors at any point.

            ChatGPT and a new academic reality: Artificial Intelligence‐written research papers and the ethics of the large language models in scholarly publishing


              Looking for Landmarks: The Role of Expert Review and Bibliometric Analysis in Evaluating Scientific Publication Outputs

Objective: To compare expert assessment with bibliometric indicators as tools to assess the quality and importance of scientific research papers.

Methods and Materials: Shortly after their publication in 2005, the quality and importance of a cohort of nearly 700 Wellcome Trust (WT) associated research papers were assessed by expert reviewers; each paper was reviewed by two WT expert reviewers. After 3 years, we compared this initial assessment with other measures of paper impact.

Results: Shortly after publication, 62 (9%) of the 687 research papers were determined to describe at least a ‘major addition to knowledge’; 6 were thought to be ‘landmark’ papers. At an aggregate level, after 3 years, there was a strong positive association between expert assessment and impact as measured by number of citations and F1000 rating. However, there were some important exceptions, indicating that bibliometric measures may not be sufficient in isolation as measures of research quality and importance, especially not for assessing single papers or small groups of research publications.

Conclusion: When attempting to assess the quality and importance of research papers, we found that sole reliance on bibliometric indicators would have led us to miss papers containing important results as judged by expert review. In particular, some papers that were highly rated by experts were not highly cited during the first three years after publication. Tools that link expert peer review of research paper quality and importance to more quantitative indicators, such as citation analysis, would be valuable additions to the field of research assessment and evaluation.

                Author and article information

Journal: Learned Publishing
Publisher: John Wiley & Sons, Ltd.
ISSN: 0953-1513 (print); 1741-4857 (electronic)
Published: 8 August 2023
Affiliations:
[1] Statistical Cybermetrics and Research Evaluation Group, University of Wolverhampton, Wolverhampton, UK
[2] Information School, University of Sheffield, Sheffield, UK
DOI: 10.1002/leap.1570
© 2023. Licensed under a Creative Commons Attribution 4.0 license: http://creativecommons.org/licenses/by/4.0/


Subjects: Assessment, Evaluation & Research methods; Intellectual property law; Information & Library science; Communication & Media studies
Keywords: academic peer review, automation, artificial intelligence, academic publishing
