
      Massive computational acceleration by using neural networks to emulate mechanism-based biological models


          Abstract

          For many biological applications, exploration of the massive parametric space of a mechanism-based model can impose a prohibitive computational demand. To overcome this limitation, we present a framework that improves computational efficiency by orders of magnitude. The key concept is to train a neural network using a limited number of simulations generated by a mechanistic model. This number is small enough that the simulations can be completed in a short time frame, but large enough to enable reliable training. The trained neural network can then be used to explore a much larger parametric space. We demonstrate this notion by training neural networks to predict pattern formation and stochastic gene expression. We further demonstrate that using an ensemble of neural networks enables the self-contained evaluation of the quality of each prediction. Our work can serve as a platform for fast parametric-space screening of biological models with user-defined objectives.
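The workflow the abstract describes — run a limited number of mechanistic simulations, train neural networks on them, then screen a much larger parameter set with the cheap surrogate, using ensemble disagreement as a self-contained confidence measure — can be sketched as follows. This is a minimal illustration only: the logistic-growth "mechanistic model", the hand-rolled two-layer network, and all sizes and hyperparameters are stand-ins, not the architectures or models used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "mechanistic model": logistic growth, x(T) = K / (1 + (K/x0 - 1) e^{-rT}).
# Stands in for an expensive simulator, evaluated here at a fixed horizon T.
def simulate(r, K, x0=0.1, T=5.0):
    return K / (1.0 + (K / x0 - 1.0) * np.exp(-r * T))

# A limited number of simulations: the training set.
n_train = 200
params = rng.uniform([0.2, 0.5], [2.0, 1.5], size=(n_train, 2))  # (r, K) pairs
y = simulate(params[:, 0], params[:, 1])

# Tiny MLP surrogate trained by full-batch gradient descent on mean squared error.
def train_mlp(X, y, hidden=16, epochs=3000, lr=0.05, seed=0):
    rg = np.random.default_rng(seed)
    W1 = rg.normal(0, 0.5, (X.shape[1], hidden)); b1 = np.zeros(hidden)
    W2 = rg.normal(0, 0.5, (hidden, 1));          b2 = np.zeros(1)
    for _ in range(epochs):
        h = np.tanh(X @ W1 + b1)
        pred = (h @ W2 + b2).ravel()
        g_pred = 2 * (pred - y)[:, None] / len(y)      # dMSE/dpred
        gW2 = h.T @ g_pred; gb2 = g_pred.sum(0)
        g_h = (g_pred @ W2.T) * (1 - h ** 2)           # backprop through tanh
        gW1 = X.T @ g_h;    gb1 = g_h.sum(0)
        W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2
    return lambda Xq: (np.tanh(Xq @ W1 + b1) @ W2 + b2).ravel()

# Ensemble of surrogates with different initializations; their disagreement
# (standard deviation) flags predictions that should not be trusted.
ensemble = [train_mlp(params, y, seed=s) for s in range(5)]

# Explore a much larger parameter set with the cheap surrogate.
query = rng.uniform([0.2, 0.5], [2.0, 1.5], size=(5000, 2))
preds = np.stack([m(query) for m in ensemble])
mean_pred, conf = preds.mean(0), preds.std(0)
```

Parameter sets where `conf` is large are those the ensemble disagrees on; in the spirit of the abstract, such low-confidence points could be routed back to the mechanistic simulator rather than trusted to the surrogate.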

          Plain-language summary

          Mechanistic models provide valuable insights, but large-scale simulations are computationally expensive. Here, the authors show that it is possible to explore the dynamics of a mechanistic model over a large set of parameters by training an artificial neural network on a smaller set of simulations.


          Most cited references (55)

          Long Short-Term Memory

          Learning to store information over extended time intervals by recurrent backpropagation takes a very long time, mostly because of insufficient, decaying error backflow. We briefly review Hochreiter's (1991) analysis of this problem, then address it by introducing a novel, efficient, gradient-based method called long short-term memory (LSTM). Truncating the gradient where this does no harm, LSTM can learn to bridge minimal time lags in excess of 1000 discrete time steps by enforcing constant error flow through constant error carousels within special units. Multiplicative gate units learn to open and close access to the constant error flow. LSTM is local in space and time; its computational complexity per time step and weight is O(1). Our experiments with artificial data involve local, distributed, real-valued, and noisy pattern representations. In comparisons with real-time recurrent learning, backpropagation through time, recurrent cascade correlation, Elman nets, and neural sequence chunking, LSTM leads to many more successful runs and learns much faster. LSTM also solves complex, artificial long-time-lag tasks that have never been solved by previous recurrent network algorithms.
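The gating scheme this reference describes can be made concrete with a single forward step of an LSTM cell. The sketch below is illustrative: it uses the now-standard variant with a forget gate (the original 1997 formulation had only input and output gates), arbitrary small random weights, and no training.

```python
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# One LSTM step. Multiplicative gates (input i, forget f, output o) open and
# close access to the cell state c — the "constant error carousel".
def lstm_step(x, h, c, W, U, b):
    z = x @ W + h @ U + b                  # all four pre-activations at once
    i, f, o, g = np.split(z, 4)
    i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
    g = np.tanh(g)                         # candidate cell update
    c_new = f * c + i * g                  # additive carousel update
    h_new = o * np.tanh(c_new)             # gated exposure of the state
    return h_new, c_new

n_in, n_hid = 3, 4
W = rng.normal(0, 0.1, (n_in, 4 * n_hid))
U = rng.normal(0, 0.1, (n_hid, 4 * n_hid))
b = np.zeros(4 * n_hid)

# Run a long sequence: the state and output remain bounded step after step,
# the property that lets gradients survive long time lags.
h, c = np.zeros(n_hid), np.zeros(n_hid)
for t in range(1000):
    x = rng.normal(size=n_in)
    h, c = lstm_step(x, h, c, W, U, b)
```

Because the cell state is updated additively (`f * c + i * g`) rather than by repeated matrix multiplication, error flowing back through `c` is rescaled only by the gates, which is the mechanism behind the "constant error flow" claim above.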

            Physics-Informed Neural Networks: A Deep Learning Framework for Solving Forward and Inverse Problems Involving Nonlinear Partial Differential Equations

              Delving Deep into Rectifiers: Surpassing Human-Level Performance on ImageNet Classification

                Author and article information

                Contributors
                you@duke.edu
                Journal
                Nat Commun
                Nature Communications
                Nature Publishing Group UK (London)
                2041-1723
                25 September 2019
                Volume: 10
                Article number: 4354
                Affiliations
                [1] Department of Biomedical Engineering, Duke University, Durham, NC 27708, USA
                [2] Department of Statistical Science, Duke University, Durham, NC 27708, USA
                [3] Center for Genomic and Computational Biology, Duke University, Durham, NC 27708, USA
                [4] Department of Molecular Genetics and Microbiology, Duke University School of Medicine, Durham, NC 27708, USA
                Author information
                http://orcid.org/0000-0002-0093-1943
                http://orcid.org/0000-0003-3725-4007
                Article
                DOI: 10.1038/s41467-019-12342-y
                PMCID: PMC6761138
                PMID: 31554788
                © The Author(s) 2019

                Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/.

                History
                Received: 4 March 2019
                Accepted: 30 August 2019
                Funding
                Funded by: United States Department of Defense | United States Navy | Office of Naval Research (ONR); FundRef https://doi.org/10.13039/100000006
                Award ID: N00014-12-1-0631; Recipient: L.Y.
                Funded by: National Science Foundation (NSF); FundRef https://doi.org/10.13039/100000001
                Recipient: L.Y.
                Funded by: U.S. Department of Health & Human Services | National Institutes of Health (NIH); FundRef https://doi.org/10.13039/100000002
                Award ID: 1R01-GM098642; Recipient: L.Y.
                Funded by: David and Lucile Packard Foundation; FundRef https://doi.org/10.13039/100000008
                Recipient: L.Y.
                Categories
                Article

                Keywords
                high-throughput screening, machine learning, synthetic biology, systems analysis
