
      Recurrent dynamics in the cerebral cortex: Integration of sensory evidence with stored knowledge

Research article


          Significance

          This review attempts to unite three hitherto rather unconnected concepts of basic functions of the cerebral cortex, taking the visual system as an example: 1) feed-forward processing in multilayer hierarchies (labeled line coding), 2) dynamic association of features (assembly coding), and 3) matching of sensory evidence with stored priors (predictive coding). The latter two functions are supposed to rely on the high-dimensional dynamics of delay-coupled recurrent networks. Discharge rates of neurons (rate code) and temporal relations among discharges (temporal code) are identified as conveying complementary information. Thus, the new concept accounts for the coexistence of feed-forward and recurrent processing, accommodates both rate and temporal codes, and assigns crucial functions to the complex dynamics emerging from recurrent interactions.
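To make the rate/temporal distinction concrete, here is a toy numpy sketch (an illustration only, not taken from the article): two pairs of spike trains have identical firing rates, so a rate code cannot separate them, while a zero-lag synchrony measure can.

```python
# Toy illustration (not from the article): identical rates, different timing.
import numpy as np

rng = np.random.default_rng(0)
n_bins = 1000  # 1 ms bins over 1 s

# Pair A: two independent Poisson-like trains, ~20 Hz each
a1 = rng.random(n_bins) < 0.02
a2 = rng.random(n_bins) < 0.02

# Pair B: same rate, but the second train is locked to the first
b1 = rng.random(n_bins) < 0.02
b2 = b1.copy()  # perfectly synchronous partner

def rate_hz(train):
    # Spike count over a 1 s train = firing rate in Hz (the rate code)
    return train.sum()

def zero_lag_sync(x, y):
    # Coincident bins, normalized by the geometric mean spike count
    # (a simple temporal-code measure)
    return (x & y).sum() / np.sqrt(x.sum() * y.sum())

print(rate_hz(a1), rate_hz(a2), zero_lag_sync(a1, a2))  # same rates, low synchrony
print(rate_hz(b1), rate_hz(b2), zero_lag_sync(b1, b2))  # same rates, synchrony = 1
```

Both pairs look identical to a rate readout; only the timing relations distinguish them, which is the sense in which the two codes carry complementary information.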

          Abstract

          Current concepts of sensory processing in the cerebral cortex emphasize serial extraction and recombination of features in hierarchically structured feed-forward networks in order to capture the relations among the components of perceptual objects. These concepts are implemented in convolutional deep learning networks and have been validated by the astounding similarities between the functional properties of artificial systems and their natural counterparts. However, cortical architectures also display an abundance of recurrent coupling within and between the layers of the processing hierarchy. This massive recurrence gives rise to highly complex dynamics whose putative function is poorly understood. Here a concept is proposed that assigns specific functions to the dynamics of cortical networks and combines, in a unifying approach, the respective advantages of recurrent and feed-forward processing. It is proposed that the priors about regularities of the world are stored in the weight distributions of feed-forward and recurrent connections and that the high-dimensional, dynamic space provided by recurrent interactions is exploited for computations. These comprise the ultrafast matching of sensory evidence with the priors covertly represented in the correlation structure of spontaneous activity and the context-dependent grouping of feature constellations characterizing natural objects. The concept posits that information is encoded not only in the discharge frequency of neurons but also in the precise timing relations among the discharges. Results of experiments designed to test the predictions derived from this concept support the hypothesis that cerebral cortex exploits the high-dimensional recurrent dynamics for computations serving predictive coding.
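One standard way to make "computations in the high-dimensional dynamic space provided by recurrent interactions" concrete is reservoir computing; the sketch below is an illustration in that spirit, not the article's own model. A fixed random recurrent network expands an input stream into a high-dimensional state, and only a linear readout is trained to recover a delayed copy of the input, i.e., information that must be held in the recurrent dynamics.

```python
# Echo-state-style sketch (assumption: reservoir computing as a stand-in for
# "computing in high-dimensional recurrent dynamics"; the article may differ).
import numpy as np

rng = np.random.default_rng(1)
N = 200                                            # reservoir dimensionality
W = rng.normal(0, 1, (N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))    # spectral radius < 1
w_in = rng.normal(0, 1, N)

def run_reservoir(u):
    """Drive the recurrent network with input sequence u; collect states."""
    x = np.zeros(N)
    states = []
    for u_t in u:
        x = np.tanh(W @ x + w_in * u_t)            # recurrent state update
        states.append(x.copy())
    return np.array(states)

# Task: reproduce the input delayed by 5 steps. The readout sees only the
# current state, so the delayed value must live in the recurrent dynamics.
u = rng.uniform(-1, 1, 500)
X = run_reservoir(u)
y = np.roll(u, 5)
w_out, *_ = np.linalg.lstsq(X[50:], y[50:], rcond=None)  # linear readout
print("readout error:", np.mean((X[50:] @ w_out - y[50:]) ** 2))
```

The "stored priors in weight distributions" of the abstract would correspond, in this simplified picture, to the fixed recurrent and trained readout weights.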


Most cited references (252)


Deep learning.

Y LeCun, Y Bengio, G Hinton (2015)

          Deep learning allows computational models that are composed of multiple processing layers to learn representations of data with multiple levels of abstraction. These methods have dramatically improved the state-of-the-art in speech recognition, visual object recognition, object detection and many other domains such as drug discovery and genomics. Deep learning discovers intricate structure in large data sets by using the backpropagation algorithm to indicate how a machine should change its internal parameters that are used to compute the representation in each layer from the representation in the previous layer. Deep convolutional nets have brought about breakthroughs in processing images, video, speech and audio, whereas recurrent nets have shone light on sequential data such as text and speech.
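As a minimal sketch of the mechanism this abstract describes — multiple processing layers whose internal parameters are adjusted by backpropagation — the toy numpy network below learns XOR. It is an illustration, not the paper's code.

```python
# Two-layer network trained by backpropagation on XOR (toy illustration).
import numpy as np

rng = np.random.default_rng(2)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1, b1 = rng.normal(0, 1, (2, 8)), np.zeros(8)
W2, b2 = rng.normal(0, 1, (8, 1)), np.zeros(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

for step in range(5000):
    # Forward pass: each layer computes its representation from the previous one
    h = np.tanh(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2)
    # Backward pass: propagate the error to get a gradient for every layer
    dp = p - y                          # cross-entropy gradient at the output
    dW2, db2 = h.T @ dp, dp.sum(0)
    dh = (dp @ W2.T) * (1 - h ** 2)     # chain rule through tanh
    dW1, db1 = X.T @ dh, dh.sum(0)
    for param, grad in ((W1, dW1), (b1, db1), (W2, dW2), (b2, db2)):
        param -= 0.1 * grad             # gradient-descent update

print(p.round(2).ravel())               # approximately [0, 1, 1, 0]
```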

Long Short-Term Memory

S Hochreiter, J Schmidhuber (1997)

Learning to store information over extended time intervals by recurrent backpropagation takes a very long time, mostly because of insufficient, decaying error backflow. We briefly review Hochreiter's (1991) analysis of this problem, then address it by introducing a novel, efficient, gradient-based method called long short-term memory (LSTM). Truncating the gradient where this does not do harm, LSTM can learn to bridge minimal time lags in excess of 1000 discrete-time steps by enforcing constant error flow through constant error carousels within special units. Multiplicative gate units learn to open and close access to the constant error flow. LSTM is local in space and time; its computational complexity per time step and weight is O(1). Our experiments with artificial data involve local, distributed, real-valued, and noisy pattern representations. In comparisons with real-time recurrent learning, backpropagation through time, recurrent cascade correlation, Elman nets, and neural sequence chunking, LSTM leads to many more successful runs, and learns much faster. LSTM also solves complex, artificial long-time-lag tasks that have never been solved by previous recurrent network algorithms.
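The sketch below implements a single LSTM step in the notation that later became standard (a simplification of the original formulation): the cell state c acts as the constant error carousel, and multiplicative gates open and close access to it. Weights are random placeholders, not trained values.

```python
# One LSTM step (standard modern formulation; weights are placeholders).
import numpy as np

rng = np.random.default_rng(3)
d_in, d_hid = 4, 3
# Stacked weights for the input, forget, and output gates plus the candidate
W = rng.normal(0, 0.5, (4 * d_hid, d_in + d_hid))
b = np.zeros(4 * d_hid)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h, c):
    z = W @ np.concatenate([x, h]) + b
    i, f, o, g = np.split(z, 4)
    i, f, o, g = sigmoid(i), sigmoid(f), sigmoid(o), np.tanh(g)
    c = f * c + i * g        # carousel: additive update keeps error flow alive
    h = o * np.tanh(c)       # gated readout of the cell state
    return h, c

h = c = np.zeros(d_hid)
for t in range(1000):        # state can persist across many time steps
    h, c = lstm_step(rng.normal(size=d_in), h, c)
print(h, c)
```

The additive cell update (rather than a repeatedly squashed one) is what lets gradients flow across the long time lags the abstract mentions.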

              Neural networks and physical systems with emergent collective computational abilities.

              J Hopfield (1982)
Computational properties of use to biological organisms or to the construction of computers can emerge as collective properties of systems having a large number of simple equivalent components (or neurons). The physical meaning of content-addressable memory is described by an appropriate phase space flow of the state of a system. A model of such a system is given, based on aspects of neurobiology but readily adapted to integrated circuits. The collective properties of this model produce a content-addressable memory which correctly yields an entire memory from any subpart of sufficient size. The algorithm for the time evolution of the state of the system is based on asynchronous parallel processing. Additional emergent collective properties include some capacity for generalization, familiarity recognition, categorization, error correction, and time sequence retention. The collective properties are only weakly sensitive to details of the modeling or the failure of individual devices.
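A minimal numpy rendering of the model described above (modern textbook notation, not Hopfield's original code): Hebbian storage of binary patterns and asynchronous updates that retrieve an entire memory from a subpart of sufficient size.

```python
# Hopfield network: content-addressable memory via asynchronous updates.
import numpy as np

rng = np.random.default_rng(4)
N = 100
patterns = rng.choice([-1, 1], size=(3, N))        # stored memories

# Hebbian weight matrix (the content-addressable storage), no self-connections
W = (patterns.T @ patterns) / N
np.fill_diagonal(W, 0)

def recall(s, sweeps=5):
    """Asynchronous updates: each unit aligns with its local field in turn."""
    s = s.copy()
    for _ in range(sweeps):
        for i in rng.permutation(N):
            s[i] = 1 if W[i] @ s >= 0 else -1
    return s

# Cue the network with a corrupted version of pattern 0
cue = patterns[0].copy()
cue[: N // 2] = rng.choice([-1, 1], size=N // 2)   # scramble half the units
print("overlap after recall:", (recall(cue) @ patterns[0]) / N)  # ~ 1.0
```

With 3 patterns in 100 units the network is far below capacity, so the corrupted cue falls into the attractor of the stored memory and the full pattern is recovered.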

                Author and article information

Journal
Proceedings of the National Academy of Sciences of the United States of America (Proc Natl Acad Sci U S A; PNAS)
Publisher: National Academy of Sciences
ISSN: 0027-8424 (print); 1091-6490 (electronic)
Issue date: 17 August 2021; published online: 6 August 2021
Volume 118, Issue 33, e2101043118
Affiliations
a) Ernst Strüngmann Institute for Neuroscience in Cooperation with Max Planck Society, Frankfurt am Main 60438, Germany
b) Max Planck Institute for Brain Research, Frankfurt am Main 60438, Germany
c) Frankfurt Institute for Advanced Studies, Frankfurt am Main 60438, Germany
                Author notes

                This contribution is part of the special series of Inaugural Articles by members of the National Academy of Sciences elected in 2017.

                Contributed by Wolf Singer, June 22, 2021 (sent for review January 18, 2021; reviewed by Michael E. Goldberg and Terrence J. Sejnowski)

                Author contributions: W.S. wrote the paper.

                Reviewers: M.E.G., Columbia University; and T.J.S., Salk Institute for Biological Studies.

Article
Publisher ID: 202101043
DOI: 10.1073/pnas.2101043118
PMCID: PMC8379985
PMID: 34362837
                Copyright © 2021 the Author(s). Published by PNAS.

                This open access article is distributed under Creative Commons Attribution License 4.0 (CC BY).

Pages: 12
Funding
Funded by: Deutsche Forschungsgemeinschaft (DFG); award ID: SI 505/22-1; award recipient: Wolf Singer
Funded by: Human Frontier Science Program (HFSP); award ID: RGP0044/2018 - Orban; award recipient: Wolf Singer
Categories
Biological Sciences: Neuroscience
Inaugural Article

Keywords: recurrent networks, neuronal dynamics, predictive coding, rate codes, temporal codes
