
      Random forest-based prediction of decay modes and half-lives of superheavy nuclei

      Nuclear Science and Techniques
      Springer Science and Business Media LLC


Most cited references (69)


          The discovery of the heaviest elements


            The limits of the nuclear landscape

            In 2011, 100 new nuclides were discovered. They joined the approximately 3,000 stable and radioactive nuclides that either occur naturally on Earth or are synthesized in the laboratory. Every atomic nucleus, characterized by a specific number of protons and neutrons, occupies a spot on the chart of nuclides, which is bounded by 'drip lines' indicating the values of neutron and proton number at which nuclear binding ends. The placement of the neutron drip line for the heavier elements is based on theoretical predictions using extreme extrapolations, and so is uncertain. However, it is not known how uncertain it is or how many protons and neutrons can be bound in a nucleus. Here we estimate these limits of the nuclear 'landscape' and provide statistical and systematic uncertainties for our predictions. We use nuclear density functional theory, several Skyrme interactions and high-performance computing, and find that the number of bound nuclides with between 2 and 120 protons is around 7,000. We find that extrapolations for drip-line positions and selected nuclear properties, including neutron separation energies relevant to astrophysical processes, are very consistent between the models used.

              A high-bias, low-variance introduction to Machine Learning for physicists

              Machine Learning (ML) is one of the most exciting and dynamic areas of modern research and application. The purpose of this review is to provide an introduction to the core concepts and tools of machine learning in a manner easily understood and intuitive to physicists. The review begins by covering fundamental concepts in ML and modern statistics such as the bias-variance tradeoff, overfitting, regularization, generalization, and gradient descent before moving on to more advanced topics in both supervised and unsupervised learning. Topics covered in the review include ensemble models, deep learning and neural networks, clustering and data visualization, energy-based models (including MaxEnt models and Restricted Boltzmann Machines), and variational methods. Throughout, we emphasize the many natural connections between ML and statistical physics. A notable aspect of the review is the use of Python Jupyter notebooks to introduce modern ML/statistical packages to readers using physics-inspired datasets (the Ising Model and Monte-Carlo simulations of supersymmetric decays of proton-proton collisions). We conclude with an extended outlook discussing possible uses of machine learning for furthering our understanding of the physical world as well as open problems in ML where physicists may be able to contribute.
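The random-forest method named in the article's title is one of the ensemble models surveyed in the review above. Neither the article's code nor its actual features are available here; the following is a minimal illustrative sketch only, assuming scikit-learn, with entirely synthetic data and a toy labeling rule standing in for real nuclear decay data.

```python
# Illustrative sketch of a random-forest decay-mode classifier.
# All data below are synthetic; the real study trains on nuclear data tables.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
Z = rng.integers(100, 120, size=200)    # proton number (synthetic)
N = rng.integers(150, 180, size=200)    # neutron number (synthetic)
X = np.column_stack([Z, N, Z + N])      # hypothetical feature set

# Toy label rule (not physics): "alpha" for small neutron excess,
# "SF" (spontaneous fission) otherwise.
y = np.where(N - Z < 55, "alpha", "SF")

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X, y)
print(clf.score(X, y))  # training accuracy on the synthetic set
```

The ensemble averages many decorrelated decision trees, which is how random forests trade a small increase in bias for a large reduction in variance, the tradeoff the review emphasizes.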

Author and article information

Journal: Nuclear Science and Techniques (NUCL SCI TECH)
Publisher: Springer Science and Business Media LLC
ISSN: 1001-8042 (print), 2210-3147 (electronic)
Published: December 13, 2023
Volume: 34, Issue: 12
DOI: 10.1007/s41365-023-01354-5
© 2023

                https://www.springernature.com/gp/researchers/text-and-data-mining

                https://www.springernature.com/gp/researchers/text-and-data-mining
