      • Record: found
      • Abstract: found
      • Article: found
      Is Open Access

      The NIH Open Citation Collection: A public access, broad coverage resource


          Abstract

          Citation data have remained hidden behind proprietary, restrictive licensing agreements, which raises barriers to entry for analysts wishing to use the data, increases the expense of performing large-scale analyses, and reduces the robustness and reproducibility of the conclusions. For the past several years, the National Institutes of Health (NIH) Office of Portfolio Analysis (OPA) has been aggregating and enhancing citation data that can be shared publicly. Here, we describe the NIH Open Citation Collection (NIH-OCC), a public access database for biomedical research that is made freely available to the community. This dataset, which has been carefully generated from unrestricted data sources such as MEDLINE, PubMed Central (PMC), and CrossRef, now underlies the citation statistics delivered in the NIH iCite analytic platform. We have also included data from a machine learning pipeline that identifies, extracts, resolves, and disambiguates references from full-text articles available on the internet. Open citation links are available to the public in a major update of iCite (https://icite.od.nih.gov).

          Abstract

          In this Community Page article, authors from the National Institutes of Health describe the Open Citation Collection, a public domain citation database that covers articles indexed in PubMed. This free resource is made available through the iCite web interface at icite.od.nih.gov and through bulk downloads.
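Both summaries above point to the iCite platform as the public delivery channel for these open citation links. As a rough illustration of programmatic access, the Python sketch below queries the iCite REST API mentioned later in this record; the /pubs endpoint path, the pmids query parameter, and the response field names are assumptions about the service's interface rather than details stated here, so check them against the API documentation at https://icite.od.nih.gov/api.

# Minimal sketch: fetch citation metrics for a few PubMed IDs from the iCite API.
# The endpoint path, query parameter, and response fields below are assumptions
# about the service at https://icite.od.nih.gov/api, not details from this record.
import json
import urllib.request

ICITE_PUBS = "https://icite.od.nih.gov/api/pubs"  # assumed endpoint path
pmids = ["31600197"]  # example: the PMID listed for this article further down

url = ICITE_PUBS + "?pmids=" + ",".join(pmids)
with urllib.request.urlopen(url) as response:
    payload = json.load(response)

# Assumed response shape: {"data": [{"pmid": ..., "citation_count": ..., ...}, ...]}
for record in payload.get("data", []):
    print(record.get("pmid"), record.get("citation_count"))

For whole-corpus analyses, the bulk downloads listed under "Custom metadata" below are the more natural route than repeated API calls.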

          Related collections

          Most cited references (12)

          • Record: found
          • Abstract: found
          • Article: not found

          Quantifying the evolution of individual scientific impact.

          Despite the frequent use of numerous quantitative indicators to gauge the professional impact of a scientist, little is known about how scientific impact emerges and evolves in time. Here, we quantify the changes in impact and productivity throughout a career in science, finding that impact, as measured by influential publications, is distributed randomly within a scientist's sequence of publications. This random-impact rule allows us to formulate a stochastic model that uncouples the effects of productivity, individual ability, and luck and unveils the existence of universal patterns governing the emergence of scientific success. The model assigns a unique individual parameter Q to each scientist, which is stable during a career, and it accurately predicts the evolution of a scientist's impact, from the h-index to cumulative citations, and independent recognitions, such as prizes.
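The "random-impact rule" and the parameter Q described above amount to a multiplicative decomposition of a paper's impact into an individual ability factor and a luck factor. The LaTeX lines below sketch that form; the notation is a reconstruction from the description above, not a quotation from the cited paper.

% Schematic of the Q model described above (a reconstruction, not a quotation):
% the long-term impact c_{i\alpha} of paper \alpha by scientist i factors into a
% stable individual parameter Q_i and a randomly drawn luck term p_\alpha.
c_{i\alpha} = Q_i \, p_\alpha ,
\qquad
\log c_{i\alpha} = \log Q_i + \log p_\alpha .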
            • Record: found
            • Abstract: found
            • Article: not found

            Quantifying long-term scientific impact.

            The lack of predictability of citation-based measures frequently used to gauge impact, from impact factors to short-term citations, raises a fundamental question: Is there long-term predictability in citation patterns? Here, we derive a mechanistic model for the citation dynamics of individual papers, allowing us to collapse the citation histories of papers from different journals and disciplines into a single curve, indicating that all papers tend to follow the same universal temporal pattern. The observed patterns not only help us uncover basic mechanisms that govern scientific impact but also offer reliable measures of influence that may have potential policy implications.
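The mechanistic model referred to above combines preferential attachment, an aging term, and a paper-specific fitness; its commonly cited closed form is sketched below in LaTeX as a reconstruction rather than a quotation, so the symbols should be checked against the original paper.

% Sketch of the citation-dynamics model described above (reconstruction):
% c_i^t is the cumulative citation count of paper i at time t after publication,
% \Phi the standard normal CDF, m the average number of references per paper,
% \lambda_i the paper's fitness, \mu_i its immediacy, and \sigma_i its longevity.
c_i^{t} = m \left[ \exp\!\left( \lambda_i \, \Phi\!\left( \frac{\ln t - \mu_i}{\sigma_i} \right) \right) - 1 \right]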
              • Record: found
              • Abstract: found
              • Article: found
              Is Open Access

              Meta-research: Evaluation and Improvement of Research Methods and Practices

              As the scientific enterprise has grown in size and diversity, we need empirical evidence on the research process to test and apply interventions that make it more efficient and its results more reliable. Meta-research is an evolving scientific discipline that aims to evaluate and improve research practices. It includes thematic areas of methods, reporting, reproducibility, evaluation, and incentives (how to do, report, verify, correct, and reward science). Much work has already been done in this growing field, but efforts to date are fragmented. We provide a map of ongoing efforts and discuss plans for connecting the multiple meta-research efforts across science worldwide.

                Author and article information

                Journal
                PLoS Biology (PLoS Biol)
                Public Library of Science (San Francisco, CA, USA)
                ISSN (print): 1544-9173
                ISSN (electronic): 1545-7885
                Published: 10 October 2019 (October 2019 issue)
                Volume: 17
                Issue: 10
                Article: e3000385
                Affiliations
                [1] Office of Portfolio Analysis, Division of Program Coordination, Planning, and Strategic Initiatives, Office of the Director, National Institutes of Health, Bethesda, Maryland, United States of America
                [2] UberResearch GmbH, Cologne, Germany
                Author notes

                The authors have declared that no competing interests exist.

                Author information
                http://orcid.org/0000-0001-7657-552X
                http://orcid.org/0000-0002-9004-3041
                http://orcid.org/0000-0002-2485-6458
                http://orcid.org/0000-0003-0294-2424
                http://orcid.org/0000-0002-6577-3106
                http://orcid.org/0000-0002-7201-3164
                Article
                Manuscript ID: PBIOLOGY-D-19-01609
                DOI: 10.1371/journal.pbio.3000385
                PMCID: PMC6786512
                PMID: 31600197

                This is an open access article, free of all copyright, and may be freely reproduced, distributed, transmitted, modified, built upon, or otherwise used by anyone for any lawful purpose. The work is made available under the Creative Commons CC0 public domain dedication.

                History
                Page count
                Figures: 2, Tables: 0, Pages: 6
                Funding
                The authors are employees of or contractors for the US Federal Government but received no specific funding for this work.
                Categories
                Community Page
                Research and Analysis Methods > Research Assessment > Citation Analysis
                Computer and Information Sciences > Artificial Intelligence > Machine Learning
                Computer and Information Sciences > Computer Applications > Web-Based Applications
                Research and Analysis Methods > Research Assessment > Reproducibility
                Computer and Information Sciences > Computer Networks > Internet
                Computer and Information Sciences > Information Technology > Data Processing
                Engineering and Technology > Technology Development > Prototypes
                Computer and Information Sciences > Data Visualization
                Custom metadata
                Data can be retrieved from the iCite web service at https://icite.od.nih.gov, through the iCite API at https://icite.od.nih.gov/api, or in bulk downloads from https://doi.org/10.35092/yhjc.c.4586573.
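As a companion to the retrieval options above, the Python sketch below tallies incoming citation links from a bulk export; the file name and the citing/referenced column names are assumptions about the download's layout rather than details given in this record, so verify them against the files behind the DOI above.

# Minimal sketch: count incoming citations per PMID in a bulk NIH-OCC export.
# Assumes the download unpacks to a CSV named "open_citation_collection.csv"
# with "citing" and "referenced" PMID columns; both the file name and the
# column names are assumptions about the layout, not details from this record.
import csv
from collections import Counter

citation_counts = Counter()
with open("open_citation_collection.csv", newline="") as handle:
    for row in csv.DictReader(handle):
        citation_counts[row["referenced"]] += 1  # one citation link into this PMID

# Print the ten most-cited PMIDs in the export.
for pmid, count in citation_counts.most_common(10):
    print(pmid, count)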

                Life sciences
