
      Being Critical of Criticality in the Brain

      review-article


          Abstract

          Relatively recent work has reported that networks of neurons can produce avalanches of activity whose sizes follow a power law distribution. This suggests that these networks may be operating near a critical point, poised between a phase where activity rapidly dies out and a phase where activity is amplified over time. The hypothesis that the electrical activity of neural networks in the brain is critical is potentially important, as many simulations suggest that information processing functions would be optimized at the critical point. This hypothesis, however, is still controversial. Here we explain the concept of criticality and review the substantial objections to the criticality hypothesis raised by skeptics. Points and counterpoints are presented in dialog form.
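The critical-point idea in the abstract can be made concrete with a toy branching process (a sketch of ours, not the authors' model): each active unit triggers on average sigma descendants, and at the critical value sigma = 1 the avalanche-size distribution becomes heavy-tailed, whereas subcritical dynamics die out quickly.

```python
import random

def avalanche_size(sigma, rng, max_size=100_000):
    """Total number of activations in one avalanche of a branching process.

    Each active unit makes two independent attempts to activate a
    descendant, each succeeding with probability sigma / 2, so the
    average branching ratio is sigma.
    """
    active, size = 1, 1
    while active and size < max_size:
        nxt = sum((rng.random() < sigma / 2) + (rng.random() < sigma / 2)
                  for _ in range(active))
        active = nxt
        size += nxt
    return size

rng = random.Random(42)
sizes = [avalanche_size(1.0, rng) for _ in range(2000)]

# At criticality most avalanches die almost immediately, but a heavy tail
# of large events appears; subcritical dynamics (sigma < 1) lack that tail.
frac_one = sum(s == 1 for s in sizes) / len(sizes)
frac_big = sum(s >= 100 for s in sizes) / len(sizes)
```

Plotting a histogram of `sizes` on log-log axes would show the approximately straight (power-law-like) tail that motivates the criticality hypothesis.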

          Related collections

          Most cited references: 109


          Real-time computing without stable states: a new framework for neural computation based on perturbations.

          A key challenge for neural modeling is to explain how a continuous stream of multimodal input from a rapidly changing environment can be processed by stereotypical recurrent circuits of integrate-and-fire neurons in real time. We propose a new computational model for real-time computing on time-varying input that provides an alternative to paradigms based on Turing machines or attractor neural networks. It does not require a task-dependent construction of neural circuits. Instead, it is based on principles of high-dimensional dynamical systems in combination with statistical learning theory and can be implemented on generic evolved or found recurrent circuitry. It is shown that the inherent transient dynamics of the high-dimensional dynamical system formed by a sufficiently large and heterogeneous neural circuit may serve as universal analog fading memory. Readout neurons can learn to extract in real time from the current state of such recurrent neural circuit information about current and past inputs that may be needed for diverse tasks. Stable internal states are not required for giving a stable output, since transient internal states can be transformed by readout neurons into stable target outputs due to the high dimensionality of the dynamical system. Our approach is based on a rigorous computational model, the liquid state machine, that, unlike Turing machines, does not require sequential transitions between well-defined discrete internal states. It is supported, as the Turing machine is, by rigorous mathematical results that predict universal computational power under idealized conditions, but for the biologically more realistic scenario of real-time processing of time-varying inputs. Our approach provides new perspectives for the interpretation of neural coding, the design of experiments and data analysis in neurophysiology, and the solution of problems in robotics and neurotechnology.
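The fading-memory claim above can be illustrated with a minimal echo-state-style sketch (our simplification using rate units, not Maass et al.'s spiking liquid state machine; all names and parameters are ours): a fixed random recurrent network holds a transient trace of its input, and a linear readout trained by least squares recovers the input from several steps in the past without any stable internal state.

```python
import numpy as np

rng = np.random.default_rng(0)
N, T, delay = 200, 1000, 5

# Random input stream and a fixed random recurrent "liquid",
# scaled so its dynamics are stable but slowly fading.
u = rng.uniform(-1, 1, T)
W = rng.normal(0, 1, (N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius 0.9
w_in = rng.uniform(-1, 1, N)

x = np.zeros(N)
states = np.zeros((T, N))
for t in range(T):
    x = np.tanh(W @ x + w_in * u[t])  # transient, history-dependent state
    states[t] = x

# Linear readout trained to recover the input from `delay` steps ago:
# the memory lives in the transient dynamics, not in the readout.
X, y = states[delay:], u[:-delay]
w_out, *_ = np.linalg.lstsq(X, y, rcond=None)
corr = np.corrcoef(X @ w_out, y)[0, 1]
```

A high correlation between the readout and the delayed input demonstrates the "analog fading memory" of the recurrent circuit.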

            Weak pairwise correlations imply strongly correlated network states in a neural population

            Biological networks have so many possible states that exhaustive sampling is impossible. Successful analysis thus depends on simplifying hypotheses, but experiments on many systems hint that complicated, higher order interactions among large groups of elements play an important role. In the vertebrate retina, we show that weak correlations between pairs of neurons coexist with strongly collective behavior in the responses of ten or more neurons. Surprisingly, we find that this collective behavior is described quantitatively by models that capture the observed pairwise correlations but assume no higher order interactions. These maximum entropy models are equivalent to Ising models, and predict that larger networks are completely dominated by correlation effects. This suggests that the neural code has associative or error-correcting properties, and we provide preliminary evidence for such behavior. As a first test for the generality of these ideas, we show that similar results are obtained from networks of cultured cortical neurons.
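A tiny enumeration makes the paper's point tangible (the couplings here are hypothetical, chosen by us for illustration): in a uniform pairwise maximum entropy (Ising) model, weak pairwise correlations can still make the all-active network state orders of magnitude more likely than an independent-neuron model would predict.

```python
import itertools, math

# Hypothetical uniform couplings for a 10-neuron Ising model:
# P(s) ~ exp(h * sum_i s_i + J * sum_{i<j} s_i s_j), with s_i in {0, 1}.
n, h, J = 10, -2.0, 0.3

def log_weight(s):
    pairs = sum(s[i] * s[j] for i in range(n) for j in range(i + 1, n))
    return h * sum(s) + J * pairs

states = list(itertools.product([0, 1], repeat=n))
weights = [math.exp(log_weight(s)) for s in states]
Z = sum(weights)
P = [w / Z for w in weights]

# Pairwise covariance between neurons 0 and 1 (all neurons are
# exchangeable here, so one pair is representative). It is weak.
p_i = sum(P[k] * s[0] for k, s in enumerate(states))
p_ij = sum(P[k] * s[0] * s[1] for k, s in enumerate(states))
cov = p_ij - p_i * p_i

# Probability that all 10 neurons fire together, versus the
# prediction of an independent model with the same firing rates.
p_all = P[-1]            # the state (1, 1, ..., 1)
ratio = p_all / p_i ** n
```

Despite `cov` being small, `ratio` comes out far above 1: weak pairwise interactions accumulate into strongly correlated collective states, which is the paper's central observation.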

              Long-range temporal correlations and scaling behavior in human brain oscillations.

              The human brain spontaneously generates neural oscillations with a large variability in frequency, amplitude, duration, and recurrence. Little, however, is known about the long-term spatiotemporal structure of the complex patterns of ongoing activity. A central unresolved issue is whether fluctuations in oscillatory activity reflect a memory of the dynamics of the system for more than a few seconds. We investigated the temporal correlations of network oscillations in the normal human brain at time scales ranging from a few seconds to several minutes. Ongoing activity during eyes-open and eyes-closed conditions was recorded with simultaneous magnetoencephalography and electroencephalography. Here we show that amplitude fluctuations of 10 and 20 Hz oscillations are correlated over thousands of oscillation cycles. Our analyses also indicated that these amplitude fluctuations obey power-law scaling behavior. The scaling exponents were highly invariant across subjects. We propose that the large variability, the long-range correlations, and the power-law scaling behavior of spontaneous oscillations find a unifying explanation within the theory of self-organized criticality, which offers a general mechanism for the emergence of correlations and complex dynamics in stochastic multiunit systems. The demonstrated scaling laws pose novel quantitative constraints on computational models of network oscillations. We argue that critical-state dynamics of spontaneous oscillations may render neural networks capable of quick reorganization during processing demands.
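Scaling exponents of the kind reported above are typically estimated with detrended fluctuation analysis (DFA). Below is a minimal DFA sketch of ours, applied to white noise for illustration, where the exponent should come out near 0.5; long-range-correlated data, like the amplitude envelopes in the study, would yield a larger exponent.

```python
import numpy as np

rng = np.random.default_rng(1)
signal = rng.standard_normal(2**14)  # white noise: no long-range memory

def fluctuation(x, n):
    # RMS fluctuation of the signal's profile (cumulative sum) after
    # removing a linear trend in each non-overlapping window of length n.
    profile = np.cumsum(x - x.mean())
    windows = profile[: len(profile) // n * n].reshape(-1, n)
    t = np.arange(n)
    resid = []
    for w in windows:
        coef = np.polyfit(t, w, 1)  # local linear detrending
        resid.append(np.mean((w - np.polyval(coef, t)) ** 2))
    return np.sqrt(np.mean(resid))

scales = np.array([16, 32, 64, 128, 256, 512])
F = np.array([fluctuation(signal, n) for n in scales])

# The DFA exponent alpha is the log-log slope of F against window size:
# ~0.5 for uncorrelated noise, >0.5 for long-range temporal correlations.
alpha, _ = np.polyfit(np.log(scales), np.log(F), 1)
```

Running the same analysis on oscillation amplitude envelopes, as the authors did, is what yields the power-law scaling behavior cited as evidence for self-organized criticality.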

                Author and article information

                Journal
                Frontiers in Physiology (Front. Physiol.)
                Frontiers Research Foundation
                ISSN: 1664-042X
                Published: 07 June 2012
                Volume: 3
                Article: 163
                Affiliations
                [1] Department of Physics, Indiana University, Bloomington, IN, USA
                [2] Biocomplexity Institute, Indiana University, Bloomington, IN, USA
                Author notes

                Edited by: Tjeerd W. Boonstra, University of New South Wales, Australia

                Reviewed by: Alain Destexhe, Information et Complexité, Centre National de la Recherche Scientifique, France; Woodrow Shew, University of Arkansas, USA

                *Correspondence: John M. Beggs, Department of Physics, Indiana University, 727 East 3rd Street, Bloomington, IN 47405-7105, USA. e-mail: jmbeggs@indiana.edu

                This article was submitted to Frontiers in Fractal Physiology, a specialty of Frontiers in Physiology.

                Article
                DOI: 10.3389/fphys.2012.00163
                PMCID: 3369250
                PMID: 22701101
                Copyright © 2012 Beggs and Timme.

                This is an open-access article distributed under the terms of the Creative Commons Attribution Non Commercial License, which permits non-commercial use, distribution, and reproduction in other forums, provided the original authors and source are credited.

                History
                Received: 21 February 2012
                Accepted: 07 May 2012
                Page count
                Figures: 9, Tables: 0, Equations: 6, References: 120, Pages: 14, Words: 13647
                Categories
                Physiology
                Review Article

                Anatomy & Physiology
                multi-electrode array, scale-free, network, avalanche, statistical physics, criticality, Ising model
