Open Access

      Optoelectronic synapses based on a triple cation perovskite and Al/MoO3 interface for neuromorphic information processing†

      research-article


          Abstract

          Optoelectronic synaptic transistors are attractive for applications in next-generation brain-like computing systems, especially for their visible-light operation and in-sensor computing capabilities. However, from a materials perspective, it is difficult to build a device that meets expectations in both function and power consumption, prompting calls for greater innovation in materials and device construction. In this study, we combined a novel triple-cation perovskite carrier supply layer with an Al/MoO3 interface carrier regulatory layer to fabricate optoelectronic synaptic devices, namely Al/MoO3/CsFAMA/ITO transistors. The device could mimic a variety of biological synaptic functions and required ultralow power consumption during operation, with an ultrafast response on the order of 0.1 μs under an optical stimulus of about 3 fJ, which is comparable to biological synapses. Moreover, Pavlovian conditioning and visual perception tasks could be implemented using spike-number-dependent plasticity (SNDP) and spike-rate-dependent plasticity (SRDP). This study suggests that the proposed CsFAMA synapse with an Al/MoO3 interface has potential for ultralow-power neuromorphic information processing.
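          The SNDP/SRDP behaviour described in the abstract can be sketched as a toy leaky-integrator model: each optical spike potentiates the synaptic weight, and the weight decays between spikes, so the retained response depends on both how many spikes arrive (SNDP) and how fast they arrive (SRDP). All parameters here (dw, tau) are hypothetical illustration values, not the device's measured characteristics.

```python
import math

def synaptic_weight(n_spikes, rate_hz, dw=0.05, tau=0.5):
    """Weight after a train of n_spikes at rate_hz, starting from 0.

    dw  -- potentiation added per optical spike (arbitrary units)
    tau -- decay time constant of the weight, in seconds
    """
    w = 0.0
    interval = 1.0 / rate_hz  # time between consecutive spikes, seconds
    for _ in range(n_spikes):
        w += dw                           # potentiation by one spike
        w *= math.exp(-interval / tau)    # exponential decay until next spike
    return w

# More spikes at the same rate -> larger retained weight (SNDP);
# the same spike count at a higher rate -> larger retained weight (SRDP).
```

In this simple model, raising either the spike number or the spike rate increases the steady response, which is the qualitative signature the abstract uses for Pavlovian conditioning and visual perception tasks.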

          Graphical abstract

          Schematic of human visual perception, information transmission, and the corresponding CsFAMA optoelectronic synaptic transistors.

          Related collections

          Most cited references: 60


          Deep learning.

          Deep learning allows computational models that are composed of multiple processing layers to learn representations of data with multiple levels of abstraction. These methods have dramatically improved the state-of-the-art in speech recognition, visual object recognition, object detection and many other domains such as drug discovery and genomics. Deep learning discovers intricate structure in large data sets by using the backpropagation algorithm to indicate how a machine should change its internal parameters that are used to compute the representation in each layer from the representation in the previous layer. Deep convolutional nets have brought about breakthroughs in processing images, video, speech and audio, whereas recurrent nets have shone light on sequential data such as text and speech.
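            The mechanism this abstract describes — backpropagation passing an error signal down through the layers so each layer's parameters can be adjusted — can be sketched with a toy two-layer network. Everything here (layer sizes, the XOR task, the learning rate of 1) is an arbitrary illustration, not anything from the cited paper.

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])  # toy inputs (XOR)
y = np.array([[0.], [1.], [1.], [0.]])                  # toy targets

W1 = rng.normal(size=(2, 8)); b1 = np.zeros(8)   # layer-1 parameters
W2 = rng.normal(size=(8, 1)); b2 = np.zeros(1)   # layer-2 parameters
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

losses = []
for _ in range(5000):
    h = sigmoid(X @ W1 + b1)       # representation computed by layer 1
    out = sigmoid(h @ W2 + b2)     # representation computed by layer 2
    losses.append(float(np.mean((out - y) ** 2)))
    d_out = (out - y) * out * (1 - out)     # error signal at the output
    d_h = (d_out @ W2.T) * h * (1 - h)      # error backpropagated to layer 1
    W2 -= h.T @ d_out; b2 -= d_out.sum(0)   # each layer updates its own
    W1 -= X.T @ d_h;   b1 -= d_h.sum(0)     # parameters from its gradient
```

The key point matching the abstract: the gradient at layer 1 (`d_h`) is computed from the gradient at layer 2 (`d_out`), so every layer's internal parameters can be adjusted from a single output error.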

            Long Short-Term Memory

            Learning to store information over extended time intervals by recurrent backpropagation takes a very long time, mostly because of insufficient, decaying error backflow. We briefly review Hochreiter's (1991) analysis of this problem, then address it by introducing a novel, efficient, gradient-based method called long short-term memory (LSTM). Truncating the gradient where this does not do harm, LSTM can learn to bridge minimal time lags in excess of 1000 discrete-time steps by enforcing constant error flow through constant error carousels within special units. Multiplicative gate units learn to open and close access to the constant error flow. LSTM is local in space and time; its computational complexity per time step and weight is O(1). Our experiments with artificial data involve local, distributed, real-valued, and noisy pattern representations. In comparisons with real-time recurrent learning, backpropagation through time, recurrent cascade correlation, Elman nets, and neural sequence chunking, LSTM leads to many more successful runs, and learns much faster. LSTM also solves complex, artificial long-time-lag tasks that have never been solved by previous recurrent network algorithms.
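            The gating mechanism the abstract describes — multiplicative gates opening and closing access to a cell state that carries error at near-constant strength (the "constant error carousel") — can be sketched as a single LSTM forward step. Note this uses the common modern formulation with a forget gate, a later extension rather than the original 1997 design; shapes and weight packing are illustrative choices.

```python
import numpy as np

def lstm_step(x, h, c, W, b):
    """One LSTM step. W has shape (len(x)+len(h), 4*len(h)): the four
    gate weight matrices packed side by side; b has shape (4*len(h),)."""
    z = np.concatenate([x, h]) @ W + b
    H = h.size
    sig = lambda a: 1.0 / (1.0 + np.exp(-a))
    i = sig(z[:H])          # input gate: admits new information
    f = sig(z[H:2*H])       # forget gate: retains/clears the cell state
    o = sig(z[2*H:3*H])     # output gate: exposes the cell state
    g = np.tanh(z[3*H:])    # candidate cell update
    c_new = f * c + i * g   # gated additive path: the "error carousel"
    h_new = o * np.tanh(c_new)
    return h_new, c_new
```

The additive update `c_new = f * c + i * g` is the point of the design: when the gates hold the path open, error flows back through `c` without the decay that plagues plain recurrent backpropagation.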

              Deep learning in neural networks: An overview

              In recent years, deep artificial neural networks (including recurrent ones) have won numerous contests in pattern recognition and machine learning. This historical survey compactly summarizes relevant work, much of it from the previous millennium. Shallow and Deep Learners are distinguished by the depth of their credit assignment paths, which are chains of possibly learnable, causal links between actions and effects. I review deep supervised learning (also recapitulating the history of backpropagation), unsupervised learning, reinforcement learning & evolutionary computation, and indirect search for short programs encoding deep and large networks.

                Author and article information

                Journal
                Nanoscale Advances (Nanoscale Adv), Royal Society of Chemistry
                ISSN: 2516-0230
                Published online: 6 December 2023; issue published: 16 January 2024
                Volume: 6
                Issue: 2
                Pages: 559-569
                Affiliations
                [a] Peng Cheng Laboratory, Shenzhen 518055, China. lightdong@yeah.net; hyzhu@bnc.org.cn
                [b] Center for Micro Nano Systems, School of Information Science and Technology (SIST), Fudan University, Shanghai 200433, China. aryu@fudan.edu.cn; yqzhan@fudan.edu.cn
                [c] Shanghai Engineering Research Center for Broadband Technologies and Applications, Shanghai 200436, China
                [d] State Key Laboratory of Advanced Optical Communication Systems and Networks, Department of Electronics and Frontiers Science Center for Nano-optoelectronics, Peking University, Beijing 100080, China
                Author information
                https://orcid.org/0009-0006-6639-2387
                https://orcid.org/0000-0002-5515-2250
                https://orcid.org/0000-0001-8539-8342
                https://orcid.org/0000-0002-2228-3633
                https://orcid.org/0000-0002-2019-6632
                https://orcid.org/0000-0001-7640-3947
                https://orcid.org/0000-0002-7415-6177
                https://orcid.org/0000-0001-8391-2555
                Article
                d3na00677h
                DOI: 10.1039/d3na00677h
                PMCID: 10790979
                PMID: 38235083
                This journal is © The Royal Society of Chemistry
                History
                Received: 23 August 2023
                Accepted: 6 December 2023
                Page count
                Pages: 11
                Funding
                Funded by: National Natural Science Foundation of China, DOI 10.13039/501100001809; Award ID: 12004258
                Funded by: National Key Research and Development Program of China, DOI 10.13039/501100012166; Award ID: 2020YFB1806405
                Funded by: Science and Technology Innovation Plan of Shanghai Science and Technology Commission, DOI 10.13039/501100018625; Award ID: 23QB1400100
                Categories
                Chemistry
                Custom metadata
                Paginated Article
