      On the Maximum Storage Capacity of the Hopfield Model

Research article


          Abstract

Recurrent neural networks (RNNs) have long been of interest for their capacity to store memories. In past years, several works have been devoted to determining the maximum storage capacity of RNNs, especially for the Hopfield network, the most popular kind of RNN. By analyzing the thermodynamic limit of the statistical properties of the Hamiltonian corresponding to the Hopfield neural network, it has been shown in the literature that the retrieval errors diverge when the number of stored memory patterns (P) exceeds a fraction (≈14%) of the network size N. In this paper, we study the storage performance of a generalized Hopfield model in which the diagonal elements of the connection matrix are allowed to be different from zero. We investigate this model at finite N. We give an analytical expression for the number of retrieval errors and show that, by increasing the number of stored patterns over a certain threshold, the errors start to decrease and reach values below unity for P ≫ N. We demonstrate that the strongest trade-off between efficiency and effectiveness relies on the number of patterns (P) stored in the network by appropriately fixing the connection weights. When P ≫ N and the diagonal elements of the adjacency matrix are not forced to be zero, the optimal storage capacity is obtained with a number of stored memories much larger than previously reported. This theory paves the way to the design of RNNs with high storage capacity that are able to retrieve the desired pattern without distortions.
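The two conventions contrasted in the abstract (zero vs. nonzero diagonal in the Hebbian connection matrix) can be illustrated with a minimal sketch. This is not the paper's generalized model or its analytical treatment, just the standard Hebbian construction at a small load P/N; all parameter values are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 100, 10   # network size and number of stored patterns (arbitrary values)

# Random binary (+/-1) memories stored with the standard Hebbian rule.
patterns = rng.choice([-1, 1], size=(P, N))
W = patterns.T @ patterns / N        # connection matrix; diagonal left nonzero

def retrieval_errors(W, pattern):
    """Spins that flip after one synchronous update started from the pattern itself."""
    s = np.sign(W @ pattern)
    s[s == 0] = 1
    return int(np.sum(s != pattern))

errors_nonzero_diag = [retrieval_errors(W, p) for p in patterns]

W0 = W.copy()
np.fill_diagonal(W0, 0.0)            # classical Hopfield convention: zero diagonal
errors_zero_diag = [retrieval_errors(W0, p) for p in patterns]
```

At this load, well below the classical ≈14% bound, both variants retrieve with few or no errors; the regime the paper is concerned with is P ≫ N, where the diagonal terms become decisive.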

          Related collections

Most cited references: 15


          Neural networks and physical systems with emergent collective computational abilities.

          J Hopfield (1982)
Computational properties of use to biological organisms or to the construction of computers can emerge as collective properties of systems having a large number of simple equivalent components (or neurons). The physical meaning of content-addressable memory is described by an appropriate phase space flow of the state of a system. A model of such a system is given, based on aspects of neurobiology but readily adapted to integrated circuits. The collective properties of this model produce a content-addressable memory which correctly yields an entire memory from any subpart of sufficient size. The algorithm for the time evolution of the state of the system is based on asynchronous parallel processing. Additional emergent collective properties include some capacity for generalization, familiarity recognition, categorization, error correction, and time sequence retention. The collective properties are only weakly sensitive to details of the modeling or the failure of individual devices.
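The content-addressable retrieval and asynchronous updating described in this abstract can be sketched in a few lines. This is a toy illustration under our own parameter choices, not Hopfield's original formulation of the experiments:

```python
import numpy as np

rng = np.random.default_rng(1)
N, P = 120, 5                        # arbitrary sizes, well below capacity
patterns = rng.choice([-1, 1], size=(P, N))
W = patterns.T @ patterns / N        # Hebbian weights
np.fill_diagonal(W, 0.0)             # no self-connections

def recall(W, cue, sweeps=10, rng=rng):
    """Asynchronous dynamics: neurons update one at a time, in random order."""
    s = cue.copy()
    for _ in range(sweeps):
        for i in rng.permutation(len(s)):
            s[i] = 1 if W[i] @ s >= 0 else -1
    return s

# Corrupt 15% of one stored pattern (a "subpart of sufficient size" survives)
# and let the dynamics flow back to the full memory.
target = patterns[0].copy()
cue = target.copy()
flip = rng.choice(N, size=int(0.15 * N), replace=False)
cue[flip] *= -1
recovered = recall(W, cue)
overlap = recovered @ target / N     # 1.0 means perfect retrieval
```

The asynchronous update never increases the network energy, which is why the state settles into the attractor nearest the cue.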

            Neurons with graded response have collective computational properties like those of two-state neurons.

            J Hopfield (1984)
A model for a large network of "neurons" with a graded response (or sigmoid input-output relation) is studied. This deterministic system has collective properties in very close correspondence with the earlier stochastic model based on McCulloch-Pitts neurons. The content-addressable memory and other emergent collective properties of the original model also are present in the graded response model. The idea that such collective properties are used in biological systems is given added credence by the continued presence of such properties for more nearly biological "neurons." Collective analog electrical circuits of the kind described will certainly function. The collective states of the two models have a simple correspondence. The original model will continue to be useful for simulations, because its connection to graded response systems is established. Equations that include the effect of action potentials in the graded response system are also developed.
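A minimal sketch of a graded-response network of this kind, assuming a tanh sigmoid and simple Euler integration (our own choices; not the circuit equations of the paper):

```python
import numpy as np

rng = np.random.default_rng(2)
N, P = 100, 3
patterns = rng.choice([-1.0, 1.0], size=(P, N))
W = patterns.T @ patterns / N        # Hebbian weights
np.fill_diagonal(W, 0.0)

def graded_dynamics(W, u0, gain=4.0, dt=0.05, steps=400):
    """Euler integration of du/dt = -u + W g(u), with g(u) = tanh(gain * u)."""
    u = u0.copy()
    for _ in range(steps):
        v = np.tanh(gain * u)        # graded (sigmoid) neuron outputs
        u = u + dt * (-u + W @ v)
    return np.tanh(gain * u)

# Start near a stored pattern; the deterministic analog network settles on it,
# matching the attractor of the two-state model.
v = graded_dynamics(W, 0.3 * patterns[0] + 0.05 * rng.standard_normal(N))
overlap = v @ patterns[0] / N
```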

              Sinc approximation of algebraically decaying functions

An extension of sinc interpolation on \(\mathbb{R}\) to the class of algebraically decaying functions is developed in the paper. As with classical sinc interpolation, we establish two types of error estimates. The first covers a wider class of functions with an algebraic order of decay on \(\mathbb{R}\). The second type of error estimate governs the case when the order of the function's decay can be estimated everywhere in a horizontal strip of the complex plane around \(\mathbb{R}\). Numerical examples are provided.
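A quick numerical illustration of the truncated sinc (cardinal) series applied to an algebraically decaying function; the test function, step size, and truncation level are our own choices, not taken from the paper:

```python
import numpy as np

def sinc_interpolate(f, h, M, x):
    """Truncated cardinal series: sum_{k=-M..M} f(k h) * sinc((x - k h) / h)."""
    nodes = np.arange(-M, M + 1) * h
    # np.sinc(t) = sin(pi t) / (pi t), exactly the cardinal-series kernel.
    return np.sum(f(nodes) * np.sinc((x - nodes) / h))

f = lambda x: 1.0 / (1.0 + x**2)    # decays algebraically, analytic in a strip
h, M = 0.25, 200                    # step size and truncation (arbitrary choices)
xs = np.linspace(-2.0, 2.0, 41)
max_err = max(abs(sinc_interpolate(f, h, M, x) - f(x)) for x in xs)
```

Because the series is truncated, the slow algebraic tail of f contributes a truncation error on top of the usual interpolation error, which is the difficulty the paper's estimates address.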

                Author and article information

                Contributors
                Journal
Frontiers in Computational Neuroscience (Front. Comput. Neurosci.)
Frontiers Media S.A.
ISSN: 1662-5188
Published: 10 January 2017
Volume: 10 (2016), Article: 144
                Affiliations
1. Center for Life Nanoscience, Istituto Italiano di Tecnologia, Rome, Italy
2. Department of Physics, Sapienza University of Rome, Rome, Italy
                Author notes

                Edited by: Marcel Van Gerven, Radboud University Nijmegen, Netherlands

                Reviewed by: Simon R. Schultz, Imperial College London, UK; Fleur Zeldenrust, Donders Institute for Brain, Cognition and Behaviour, Netherlands

*Correspondence: Viola Folli viola.folli@iit.it
Article
DOI: 10.3389/fncom.2016.00144
PMCID: PMC5222833
PMID: 28119595
                Copyright © 2017 Folli, Leonetti and Ruocco.

                This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

History
Received: 19 September 2016
Accepted: 20 December 2016
                Page count
                Figures: 7, Tables: 0, Equations: 18, References: 24, Pages: 9, Words: 6695
                Categories
                Neuroscience
                Original Research

Keywords: maximum storage memory, feed-forward structure, random recurrent network, Hopfield model, retrieval error
