      Information in a Network of Neuronal Cells: Effect of Cell Density and Short-Term Depression

      research-article


          Abstract

Neurons are specialized, electrically excitable cells that use electrical and chemical signals to transmit and process information. Understanding how the cooperation of a great many neurons in a grid may modify, and perhaps improve, information quality compared with a few neurons in isolation is critical for the rational design of cell-material interfaces for applications in regenerative medicine, tissue engineering, and personalized lab-on-a-chip devices. In the present paper, we couple an integrate-and-fire model with information-theoretic variables to analyse the amount of information in a network of nerve cells. We provide an estimate of the information in the network, in bits, as a function of cell density and short-term depression time. In the model, neurons are connected through a Delaunay triangulation of non-intersecting edges; in doing so, the number of connecting synapses per neuron is kept approximately constant, reproducing the early stage of network development in planar neural cell cultures. In simulations where the number of nodes is varied, we observe an optimal value of cell density for which the information in the grid is maximized. In simulations in which the post-transmission latency time is varied, we observe that information increases as the latency time decreases and, for specific configurations of the grid, is greatly enhanced in a resonance effect.
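As a rough illustration of the setup the abstract describes, the sketch below places leaky integrate-and-fire neurons at random planar positions, wires them through the non-intersecting edges of a Delaunay triangulation, imposes a post-spike latency as a stand-in for short-term depression, and estimates pairwise information between connected spike trains in bits. This is not the authors' code: the function names, parameter values, network size, and the plug-in mutual-information estimator are all illustrative assumptions.

```python
# Minimal sketch, assuming simple leaky integrate-and-fire (LIF) dynamics and a
# plug-in estimate of mutual information between connected spike trains.
# Not the authors' code; all parameter values are illustrative.
import numpy as np
from scipy.spatial import Delaunay

rng = np.random.default_rng(0)

def delaunay_adjacency(points):
    """Adjacency matrix built from the non-intersecting edges of a Delaunay triangulation."""
    tri = Delaunay(points)
    n = len(points)
    adj = np.zeros((n, n))
    for simplex in tri.simplices:
        for i in range(3):
            a, b = simplex[i], simplex[(i + 1) % 3]
            adj[a, b] = adj[b, a] = 1.0
    return adj

def simulate_lif(adj, steps=2000, latency=5, v_thresh=1.0, leak=0.95, w=0.3, drive=0.05):
    """LIF network; 'latency' is the post-spike silent period standing in for depression."""
    n = adj.shape[0]
    v = np.zeros(n)
    refractory = np.zeros(n, dtype=int)
    spikes = np.zeros((steps, n), dtype=bool)
    for t in range(steps):
        noise = (rng.random(n) < drive).astype(float)          # external random drive
        synaptic = w * (adj @ spikes[t - 1].astype(float)) if t > 0 else 0.0
        v = leak * v + synaptic + noise
        fired = (v >= v_thresh) & (refractory == 0)
        spikes[t] = fired
        v[fired] = 0.0
        refractory[fired] = latency
        refractory[refractory > 0] -= 1
    return spikes

def mutual_information(x, y):
    """Mutual information (bits) between two binary spike trains, from their joint histogram."""
    mi = 0.0
    for a in (0, 1):
        for b in (0, 1):
            p_ab = np.mean((x == a) & (y == b))
            if p_ab > 0:
                mi += p_ab * np.log2(p_ab / (np.mean(x == a) * np.mean(y == b)))
    return mi

# Vary n_cells (density) or 'latency' to mimic the two families of simulations described above.
n_cells = 60
points = rng.random((n_cells, 2))                              # random planar "culture"
adj = delaunay_adjacency(points)
spikes = simulate_lif(adj, latency=5)
edges = np.argwhere(np.triu(adj) > 0)
mean_mi = np.mean([mutual_information(spikes[:, i], spikes[:, j]) for i, j in edges])
print(f"mean pairwise information: {mean_mi:.4f} bits")
```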

Most cited references (32)


          Neural networks and physical systems with emergent collective computational abilities.

          J Hopfield (1982)
Computational properties of use to biological organisms or to the construction of computers can emerge as collective properties of systems having a large number of simple equivalent components (or neurons). The physical meaning of content-addressable memory is described by an appropriate phase space flow of the state of a system. A model of such a system is given, based on aspects of neurobiology but readily adapted to integrated circuits. The collective properties of this model produce a content-addressable memory which correctly yields an entire memory from any subpart of sufficient size. The algorithm for the time evolution of the state of the system is based on asynchronous parallel processing. Additional emergent collective properties include some capacity for generalization, familiarity recognition, categorization, error correction, and time sequence retention. The collective properties are only weakly sensitive to details of the modeling or the failure of individual devices.
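A minimal sketch of the content-addressable memory described above: binary patterns stored with a Hebbian outer-product rule and recovered from a corrupted cue by asynchronous updates. This is a generic textbook-style Hopfield network, not code from the 1982 paper; the pattern count, network size, and corruption level are illustrative assumptions.

```python
# Minimal Hopfield-network sketch: Hebbian storage plus asynchronous recall.
import numpy as np

rng = np.random.default_rng(1)

def train_hopfield(patterns):
    """Hebbian outer-product weights; zero diagonal (no self-connections)."""
    n = patterns.shape[1]
    w = sum(np.outer(p, p) for p in patterns) / n
    np.fill_diagonal(w, 0.0)
    return w

def recall(w, cue, sweeps=10):
    """Asynchronous updates: neurons flip one at a time toward a stored memory."""
    state = cue.copy()
    for _ in range(sweeps):
        for i in rng.permutation(len(state)):
            state[i] = 1.0 if w[i] @ state >= 0 else -1.0
    return state

patterns = rng.choice([-1, 1], size=(3, 100)).astype(float)    # three stored memories
cue = patterns[0].copy()
cue[:30] *= -1                                                 # corrupt 30% of the bits
recovered = recall(train_hopfield(patterns), cue)
print("overlap with stored memory:", np.mean(recovered == patterns[0]))
```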

            Short-term synaptic plasticity.

            Synaptic transmission is a dynamic process. Postsynaptic responses wax and wane as presynaptic activity evolves. This prominent characteristic of chemical synaptic transmission is a crucial determinant of the response properties of synapses and, in turn, of the stimulus properties selected by neural networks and of the patterns of activity generated by those networks. This review focuses on synaptic changes that result from prior activity in the synapse under study, and is restricted to short-term effects that last for at most a few minutes. Forms of synaptic enhancement, such as facilitation, augmentation, and post-tetanic potentiation, are usually attributed to effects of a residual elevation in presynaptic [Ca(2+)]i, acting on one or more molecular targets that appear to be distinct from the secretory trigger responsible for fast exocytosis and phasic release of transmitter to single action potentials. We discuss the evidence for this hypothesis, and the origins of the different kinetic phases of synaptic enhancement, as well as the interpretation of statistical changes in transmitter release and roles played by other factors such as alterations in presynaptic Ca(2+) influx or postsynaptic levels of [Ca(2+)]i. Synaptic depression dominates enhancement at many synapses. Depression is usually attributed to depletion of some pool of readily releasable vesicles, and various forms of the depletion model are discussed. Depression can also arise from feedback activation of presynaptic receptors and from postsynaptic processes such as receptor desensitization. In addition, glial-neuronal interactions can contribute to short-term synaptic plasticity. Finally, we summarize the recent literature on putative molecular players in synaptic plasticity and the effects of genetic manipulations and other modulatory influences.
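As one concrete illustration of the vesicle-depletion picture of short-term depression discussed above, the sketch below follows the spirit of a Tsodyks-Markram-style depletion model: each presynaptic spike releases a fraction U of the currently available resources R, which then recover exponentially with time constant tau_rec. It is not taken from this review, and the parameter values are illustrative assumptions.

```python
# Minimal depletion-model sketch of short-term synaptic depression.
import numpy as np

def depressing_synapse(spike_times, U=0.5, tau_rec=0.8):
    """Relative postsynaptic response amplitude for each spike in a train."""
    R, last_t, amplitudes = 1.0, None, []
    for t in spike_times:
        if last_t is not None:
            R = 1.0 - (1.0 - R) * np.exp(-(t - last_t) / tau_rec)   # recovery between spikes
        amplitudes.append(U * R)   # released fraction sets the response
        R -= U * R                 # depletion caused by the spike itself
        last_t = t
    return amplitudes

# A 20 Hz train: successive responses shrink, the hallmark of synaptic depression.
train = np.arange(0, 0.5, 0.05)
print([round(a, 3) for a in depressing_synapse(train)])
```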

              The Dynamic Brain: From Spiking Neurons to Neural Masses and Cortical Fields

              The cortex is a complex system, characterized by its dynamics and architecture, which underlie many functions such as action, perception, learning, language, and cognition. Its structural architecture has been studied for more than a hundred years; however, its dynamics have been addressed much less thoroughly. In this paper, we review and integrate, in a unifying framework, a variety of computational approaches that have been used to characterize the dynamics of the cortex, as evidenced at different levels of measurement. Computational models at different space–time scales help us understand the fundamental mechanisms that underpin neural processes and relate these processes to neuroscience data. Modeling at the single neuron level is necessary because this is the level at which information is exchanged between the computing elements of the brain; the neurons. Mesoscopic models tell us how neural elements interact to yield emergent behavior at the level of microcolumns and cortical columns. Macroscopic models can inform us about whole brain dynamics and interactions between large-scale neural systems such as cortical regions, the thalamus, and brain stem. Each level of description relates uniquely to neuroscience data, from single-unit recordings, through local field potentials to functional magnetic resonance imaging (fMRI), electroencephalogram (EEG), and magnetoencephalogram (MEG). Models of the cortex can establish which types of large-scale neuronal networks can perform computations and characterize their emergent properties. Mean-field and related formulations of dynamics also play an essential and complementary role as forward models that can be inverted given empirical data. This makes dynamic models critical in integrating theory and experiments. We argue that elaborating principled and informed models is a prerequisite for grounding empirical neuroscience in a cogent theoretical framework, commensurate with the achievements in the physical sciences.
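As one concrete example of the mean-field (neural mass) level of description discussed above, the toy model below integrates a coupled excitatory-inhibitory rate pair with sigmoidal coupling, in the style of Wilson and Cowan. It is not a model taken from the review, and all coupling constants and time constants are illustrative assumptions.

```python
# Minimal Wilson-Cowan-style neural mass sketch: two population rates, Euler integration.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def wilson_cowan(steps=2000, dt=0.001, tau=0.01,
                 w_ee=12.0, w_ei=10.0, w_ie=10.0, w_ii=2.0,
                 drive_e=1.5, drive_i=0.0):
    """Coupled excitatory (E) and inhibitory (I) population rates."""
    E, I = 0.1, 0.1
    trace = []
    for _ in range(steps):
        dE = (-E + sigmoid(w_ee * E - w_ei * I + drive_e)) / tau
        dI = (-I + sigmoid(w_ie * E - w_ii * I + drive_i)) / tau
        E, I = E + dt * dE, I + dt * dI
        trace.append((E, I))
    return np.array(trace)

print("final E/I rates:", wilson_cowan()[-1])
```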

                Author and article information

Journal
BioMed Research International (Biomed Res Int; BMRI)
Hindawi Publishing Corporation
ISSN: 2314-6133, 2314-6141
Published: 14 June 2016
Volume 2016, Article ID 2769698
                Affiliations
1. Department of Experimental and Clinical Medicine, University of Magna Graecia, 88100 Catanzaro, Italy
2. King Abdullah University of Science and Technology, Thuwal 23955-6900, Saudi Arabia
3. Department of Electrical Engineering and Information Technology, University of Naples, 80125 Naples, Italy
                Author notes

                Academic Editor: Maria G. Knyazeva

Article
DOI: 10.1155/2016/2769698
PMCID: PMC4923608
PMID: 27403421
dce0a714-1ca4-4554-99eb-ea02013d7286
                Copyright © 2016 Valentina Onesto et al.

                This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

History
Received: 17 December 2015
Accepted: 10 May 2016
                Categories
                Research Article
