Open Access

      Circuit Investigations With Open-Source Miniaturized Microscopes: Past, Present and Future

Review article


          Abstract

The ability to simultaneously image the spatiotemporal activity signatures of many neurons during unrestrained vertebrate behaviors has become possible through the development of miniaturized fluorescence microscopes, or miniscopes, sufficiently light to be carried by small animals such as bats, birds and rodents. Miniscopes have permitted the study of circuits underlying song vocalization, action sequencing, head-direction tuning, spatial memory encoding and sleep, to name a few. The foundation for these microscopes has been laid over the last two decades through academic research, with some of this work resulting in commercialization. More recently, open-source initiatives have led to an even broader adoption of miniscopes in the neuroscience community. Open-source designs allow for rapid modification and extension of their function, which has resulted in a new generation of miniscopes that now permit wire-free or wireless recording, concurrent electrophysiology and imaging, two-color fluorescence detection, simultaneous optical actuation and read-out, as well as wide-field and volumetric light-field imaging. These novel miniscopes will further expand the toolset of those seeking affordable methods to probe neural circuit function during naturalistic behaviors. Here, we will discuss the early development, present use and future potential of miniscopes.

          Related collections

Most cited references (73)


          DeepLabCut: markerless pose estimation of user-defined body parts with deep learning

          Quantifying behavior is crucial for many applications in neuroscience. Videography provides easy methods for the observation and recording of animal behavior in diverse settings, yet extracting particular aspects of a behavior for further analysis can be highly time consuming. In motor control studies, humans or other animals are often marked with reflective markers to assist with computer-based tracking, but markers are intrusive, and the number and location of the markers must be determined a priori. Here we present an efficient method for markerless pose estimation based on transfer learning with deep neural networks that achieves excellent results with minimal training data. We demonstrate the versatility of this framework by tracking various body parts in multiple species across a broad collection of behaviors. Remarkably, even when only a small number of frames are labeled (~200), the algorithm achieves excellent tracking performance on test frames that is comparable to human accuracy.

            Sensitive red protein calcium indicators for imaging neural activity

            Genetically encoded calcium indicators (GECIs) allow measurement of activity in large populations of neurons and in small neuronal compartments, over times of milliseconds to months. Although GFP-based GECIs are widely used for in vivo neurophysiology, GECIs with red-shifted excitation and emission spectra have advantages for in vivo imaging because of reduced scattering and absorption in tissue, and a consequent reduction in phototoxicity. However, current red GECIs are inferior to the state-of-the-art GFP-based GCaMP6 indicators for detecting and quantifying neural activity. Here we present improved red GECIs based on mRuby (jRCaMP1a, b) and mApple (jRGECO1a), with sensitivity comparable to GCaMP6. We characterized the performance of the new red GECIs in cultured neurons and in mouse, Drosophila, zebrafish and C. elegans in vivo. Red GECIs facilitate deep-tissue imaging, dual-color imaging together with GFP-based reporters, and the use of optogenetics in combination with calcium imaging. DOI: http://dx.doi.org/10.7554/eLife.12727.001

              Ultrafast neuronal imaging of dopamine dynamics with designed genetically encoded sensors

              Neuromodulatory systems exert profound influences on brain function. Understanding how these systems modify the operating mode of target circuits requires measuring spatiotemporally precise neuromodulator release. We developed dLight1, an intensity-based genetically encoded dopamine indicator, to enable optical recording of dopamine dynamics with high spatiotemporal resolution in behaving mice. We demonstrated the utility of dLight1 by imaging dopamine dynamics simultaneously with pharmacological manipulation, electrophysiological or optogenetic stimulation, and calcium imaging of local neuronal activity. dLight1 enabled chronic tracking of learning-induced changes in millisecond dopamine transients in striatum. Further, we used dLight1 to image spatially distinct, functionally heterogeneous dopamine transients relevant to learning and motor control in cortex. We also validated our sensor design platform for developing norepinephrine, serotonin, melatonin, and opioid neuropeptide indicators.

                Author and article information

Journal
Frontiers in Cellular Neuroscience (Front. Cell. Neurosci.)
Publisher: Frontiers Media S.A.
ISSN: 1662-5102
Published: 05 April 2019
Volume: 13
Article: 141
                Affiliations
1. Department of Neurology, David Geffen School of Medicine, University of California, Los Angeles, Los Angeles, CA, United States
2. Department of Neuroscience, Erasmus Medical Center, Rotterdam, Netherlands
3. Netherlands Institute for Neuroscience, Royal Netherlands Academy of Arts and Sciences, Amsterdam, Netherlands
                Author notes

                Edited by: Philippe Isope, Centre National de la Recherche Scientifique (CNRS), France

                Reviewed by: Leonardo Sacconi, University of Florence, Italy; Takashi Tominaga, Tokushima Bunri University, Japan; Romain Goutagny, UMR7364 Laboratoire de Neurosciences Cognitives et Adaptatives (LNCA), France

*Correspondence: Daniel Aharoni, dbaharoni@gmail.com; Tycho M. Hoogland, tmhoogland@gmail.com
Article
DOI: 10.3389/fncel.2019.00141
PMCID: PMC6461004
PMID: 31024265
                Copyright © 2019 Aharoni and Hoogland.

                This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

History
Received: 23 January 2019
Accepted: 20 March 2019
                Page count
                Figures: 5, Tables: 0, Equations: 0, References: 93, Pages: 12, Words: 10009
                Categories
                Neuroscience
                Review

Keywords: miniscope, behavior, freely moving animals, open-source, miniaturization, 3D printing, systems neurobiology
