
      Deep-Channel uses deep neural networks to detect single-molecule events from patch-clamp data

      research-article


          Abstract

Single-molecule research techniques such as patch-clamp electrophysiology deliver unique biological insight by capturing the movement of individual proteins in real time, unobscured by whole-cell ensemble averaging. The critical first step in analysis is event detection, so-called “idealisation”, where noisy raw data are turned into discrete records of protein movement. To date there have been practical limitations in patch-clamp data idealisation; high-quality idealisation is typically laborious and becomes infeasible and subjective with complex biological data containing many distinct native single-ion-channel proteins gating simultaneously. Here, we show that a deep learning model based on convolutional neural networks and long short-term memory (LSTM) architecture can automatically idealise complex single-molecule activity more accurately and faster than traditional methods. There are no parameters to set, such as baseline, channel amplitude, or number of channels. We believe this approach could revolutionise the unsupervised automatic detection of single-molecule transition events in the future.
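The supervised setup the abstract describes (a long raw current trace paired with a per-sample count of open channels, framed into fixed-length windows for a sequence model) can be sketched with a toy example. The unitary amplitude, noise level, and window length below are illustrative assumptions, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a toy multi-channel record: the "true" idealisation is the
# number of open channels at each sample; the raw trace adds an assumed
# unitary amplitude and Gaussian noise (hypothetical values).
n_samples = 10_000
open_channels = rng.integers(0, 3, size=n_samples)   # 0, 1 or 2 channels open
amplitude_pA = 1.5                                    # assumed unitary current
raw = open_channels * amplitude_pA + rng.normal(0.0, 0.4, size=n_samples)

# Frame the trace into fixed-length overlapping windows, the usual way a
# recurrent model consumes a long time series for per-timestep labelling.
window, step = 256, 128
starts = np.arange(0, n_samples - window + 1, step)
X = np.stack([raw[s:s + window] for s in starts])[..., np.newaxis]  # (batch, time, 1)
y = np.stack([open_channels[s:s + window] for s in starts])          # (batch, time)

print(X.shape, y.shape)  # (77, 256, 1) (77, 256)
```

A CNN-LSTM classifier would then map each window of `X` to a per-timestep class in `y`; the framing above is the model-agnostic part of that pipeline.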

          Abstract

Numan Celik et al. present a deep learning model that automatically detects single-molecule events against a noisy background in patch-clamp electrophysiological data, based on convolutional neural networks and long short-term memory architecture. This algorithm represents a step toward a fully automated electrophysiological platform.

          Related collections

Most cited references (41)


          Deep learning.

          Deep learning allows computational models that are composed of multiple processing layers to learn representations of data with multiple levels of abstraction. These methods have dramatically improved the state-of-the-art in speech recognition, visual object recognition, object detection and many other domains such as drug discovery and genomics. Deep learning discovers intricate structure in large data sets by using the backpropagation algorithm to indicate how a machine should change its internal parameters that are used to compute the representation in each layer from the representation in the previous layer. Deep convolutional nets have brought about breakthroughs in processing images, video, speech and audio, whereas recurrent nets have shone light on sequential data such as text and speech.

            Long Short-Term Memory

Learning to store information over extended time intervals by recurrent backpropagation takes a very long time, mostly because of insufficient, decaying error backflow. We briefly review Hochreiter's (1991) analysis of this problem, then address it by introducing a novel, efficient, gradient-based method called long short-term memory (LSTM). Truncating the gradient where this does not do harm, LSTM can learn to bridge minimal time lags in excess of 1000 discrete-time steps by enforcing constant error flow through constant error carousels within special units. Multiplicative gate units learn to open and close access to the constant error flow. LSTM is local in space and time; its computational complexity per time step and weight is O(1). Our experiments with artificial data involve local, distributed, real-valued, and noisy pattern representations. In comparisons with real-time recurrent learning, backpropagation through time, recurrent cascade correlation, Elman nets, and neural sequence chunking, LSTM leads to many more successful runs, and learns much faster. LSTM also solves complex, artificial long-time-lag tasks that have never been solved by previous recurrent network algorithms.
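The multiplicative gates and the additive cell-state update (the "constant error carousel") described above can be written out as a single forward step in plain NumPy. The weight shapes, gate ordering, and initialisation here are illustrative assumptions, not taken from the original paper:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM forward step. W: (4H, D), U: (4H, H), b: (4H,).
    Rows ordered [input gate | forget gate | cell candidate | output gate]."""
    H = h_prev.shape[0]
    z = W @ x + U @ h_prev + b
    i = sigmoid(z[0:H])          # input gate: admit new information
    f = sigmoid(z[H:2*H])        # forget gate: retain old cell state
    g = np.tanh(z[2*H:3*H])      # candidate cell content
    o = sigmoid(z[3*H:4*H])      # output gate: expose cell state
    c = f * c_prev + i * g       # additive update: the constant error carousel
    h = o * np.tanh(c)
    return h, c

rng = np.random.default_rng(1)
D, H = 3, 5
W = rng.normal(0, 0.1, (4 * H, D))
U = rng.normal(0, 0.1, (4 * H, H))
b = np.zeros(4 * H)

h, c = np.zeros(H), np.zeros(H)
for x in rng.normal(size=(10, D)):   # run ten time steps
    h, c = lstm_step(x, h, c, W, U, b)
print(h.shape)  # (5,)
```

The additive form of the cell update `c = f * c_prev + i * g` is what lets gradients flow across long time lags: when the forget gate stays near one, error signals pass through `c` largely undiminished.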

              SMOTE: Synthetic Minority Over-sampling Technique

An approach to the construction of classifiers from imbalanced datasets is described. A dataset is imbalanced if the classification categories are not approximately equally represented. Often real-world data sets are predominately composed of “normal” examples with only a small percentage of “abnormal” or “interesting” examples. It is also the case that the cost of misclassifying an abnormal (interesting) example as a normal example is often much higher than the cost of the reverse error. Under-sampling of the majority (normal) class has been proposed as a good means of increasing the sensitivity of a classifier to the minority class. This paper shows that a combination of our method of over-sampling the minority (abnormal) class and under-sampling the majority (normal) class can achieve better classifier performance (in ROC space) than only under-sampling the majority class. This paper also shows that a combination of our method of over-sampling the minority class and under-sampling the majority class can achieve better classifier performance (in ROC space) than varying the loss ratios in Ripper or class priors in Naive Bayes. Our method of over-sampling the minority class involves creating synthetic minority class examples. Experiments are performed using C4.5, Ripper and a Naive Bayes classifier. The method is evaluated using the area under the Receiver Operating Characteristic curve (AUC) and the ROC convex hull strategy.
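The over-sampling step itself (interpolating between a minority example and one of its k nearest minority neighbours) can be sketched in a few lines of NumPy. This is a minimal illustration of the idea, not the reference SMOTE implementation:

```python
import numpy as np

def smote(X_min, n_new, k=5, rng=None):
    """Minimal SMOTE sketch: for each synthetic sample, pick a minority
    point, pick one of its k nearest minority neighbours, and interpolate
    at a random fraction along the segment between them."""
    if rng is None:
        rng = np.random.default_rng()
    # pairwise Euclidean distances within the minority class
    d = np.linalg.norm(X_min[:, None, :] - X_min[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)                 # exclude self-matches
    nn = np.argsort(d, axis=1)[:, :k]           # k nearest neighbours per point
    base = rng.integers(0, len(X_min), n_new)   # base point for each new sample
    neigh = nn[base, rng.integers(0, k, n_new)] # one random neighbour per base
    gap = rng.random((n_new, 1))                # interpolation fraction in [0, 1)
    return X_min[base] + gap * (X_min[neigh] - X_min[base])

rng = np.random.default_rng(2)
X_min = rng.normal(size=(20, 2))        # 20 "abnormal" minority examples
X_syn = smote(X_min, n_new=80, rng=rng)
print(X_syn.shape)  # (80, 2)
```

Because each synthetic point lies on a segment between two real minority examples, the method densifies the minority region rather than duplicating points, which is what improves classifier performance in ROC space relative to plain replication.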

                Author and article information

                Contributors
                rbj@liverpool.ac.uk
                Journal
                Commun Biol
                Commun Biol
                Communications Biology
                Nature Publishing Group UK (London )
                2399-3642
                7 January 2020
                7 January 2020
                2020
Volume: 3
Article: 3
                Affiliations
[1] Faculty of Health and Life Science, University of Liverpool, Liverpool, UK (ISNI 0000 0004 1936 8470; GRID grid.10025.36)
[2] Department of Computer Science, University of Liverpool, Liverpool, UK (ISNI 0000 0004 1936 8470; GRID grid.10025.36)
                Author information
                http://orcid.org/0000-0003-1813-1036
                http://orcid.org/0000-0002-0532-1992
                http://orcid.org/0000-0003-0449-9972
Article: 729
DOI: 10.1038/s42003-019-0729-3
PMCID: PMC6946689
PMID: 31925311
                © The Author(s) 2020

                Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/.

                History
Received: 6 August 2019
Accepted: 6 December 2019
                Funding
                Funded by: FundRef https://doi.org/10.13039/501100000268, RCUK | Biotechnology and Biological Sciences Research Council (BBSRC);
                Award ID: BB/R022143/1
                Award ID: BB/N003020/1
                Categories
                Article
                Custom metadata
                © The Author(s) 2020

ion transport, machine learning
