      Interpretation of Entropy Algorithms in the Context of Biomedical Signal Analysis and Their Application to EEG Analysis in Epilepsy

      research-article


          Abstract

          Biomedical signals are measurable time series that describe a physiological state of a biological system. Entropy algorithms have been previously used to quantify the complexity of biomedical signals, but there is a need to understand the relationship of entropy to signal processing concepts. In this study, ten synthetic signals that represent widely encountered signal structures in the field of signal processing were created to interpret permutation, modified permutation, sample, quadratic sample and fuzzy entropies. Subsequently, the entropy algorithms were applied to two different databases containing electroencephalogram (EEG) signals from epilepsy studies. Transitions from randomness to periodicity were successfully detected in the synthetic signals, while significant differences in EEG signals were observed based on different regions and states of the brain. In addition, using results from one entropy algorithm as features and the k-nearest neighbours algorithm, maximum classification accuracies in the first EEG database ranged from 63% to 73.5%, while these values increased by approximately 20% when using two different entropies as features. For the second database, maximum classification accuracy reached 62.5% using one entropy algorithm, while using two algorithms as features further increased that by 10%. Embedding entropies (sample, quadratic sample and fuzzy entropies) are found to outperform the rest of the algorithms in terms of sensitivity and show greater potential by considering the fine-tuning possibilities they offer. On the other hand, permutation and modified permutation entropies are more consistent across different input parameter values and considerably faster to calculate.
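To make the classification step described in the abstract concrete, the sketch below shows how per-epoch entropy values could be fed to a k-nearest neighbours classifier using scikit-learn. It is an illustration only: the feature and label arrays are random placeholders, and the 5-neighbour setting and 70/30 split are assumptions, not the parameters used in the study.

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Hypothetical features: one or two entropy values per EEG epoch,
# e.g. sample entropy alone, or sample entropy plus permutation entropy.
n_epochs = 200
features = rng.normal(size=(n_epochs, 2))      # [SampEn, PermEn] per epoch (placeholder)
labels = rng.integers(0, 2, size=n_epochs)     # e.g. 0 = one brain state, 1 = another (placeholder)

X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.3, random_state=0, stratify=labels)

knn = KNeighborsClassifier(n_neighbors=5)
knn.fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, knn.predict(X_test)))

With only one or two features per epoch, k-NN requires nothing beyond storing the feature vectors; the abstract's observation that adding a second entropy as a feature raises accuracy corresponds to widening the feature array from one column to two.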


          Most cited references (42)


          Physiological time-series analysis using approximate entropy and sample entropy.

          Entropy, as it relates to dynamical systems, is the rate of information production. Methods for estimation of the entropy of a system represented by a time series are not, however, well suited to analysis of the short and noisy data sets encountered in cardiovascular and other biological studies. Pincus introduced approximate entropy (ApEn), a set of measures of system complexity closely related to entropy, which is easily applied to clinical cardiovascular and other time series. ApEn statistics, however, lead to inconsistent results. We have developed a new and related complexity measure, sample entropy (SampEn), and have compared ApEn and SampEn by using them to analyze sets of random numbers with known probabilistic character. We have also evaluated cross-ApEn and cross-SampEn, which use cardiovascular data sets to measure the similarity of two distinct time series. SampEn agreed with theory much more closely than ApEn over a broad range of conditions. The improved accuracy of SampEn statistics should make them useful in the study of experimental clinical cardiovascular and other biological time series.
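As an illustration of the SampEn definition summarised above, the following is a compact, unoptimised Python/NumPy sketch: it counts template matches of length m and m+1 within a tolerance r (excluding self-matches) and returns the negative logarithm of their ratio. The defaults m = 2 and r = 0.2 times the signal's standard deviation are common conventions, not values prescribed by this reference.

import numpy as np

def sample_entropy(x, m=2, r=0.2):
    x = np.asarray(x, dtype=float)
    tol = r * np.std(x)                 # tolerance as a fraction of the SD
    n = len(x)

    def count_matches(dim):
        # All overlapping templates of length `dim` (N - m of them, so that
        # counts at m and m + 1 are comparable).
        templates = np.array([x[i:i + dim] for i in range(n - m)])
        matches = 0
        for i in range(len(templates)):
            # Chebyshev distance to later templates only (no self-matches;
            # counting each unordered pair once leaves the ratio unchanged).
            dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            matches += np.sum(dist <= tol)
        return matches

    b = count_matches(m)
    a = count_matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else float("inf")

# A periodic signal yields a low value; white noise yields a higher one.
print(sample_entropy(np.sin(np.linspace(0, 20 * np.pi, 1000))))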

            Approximate entropy as a measure of system complexity.

            Techniques to determine changing system complexity from data are evaluated. Convergence of a frequently used correlation dimension algorithm to a finite value does not necessarily imply an underlying deterministic model or chaos. Analysis of a recently developed family of formulas and statistics, approximate entropy (ApEn), suggests that ApEn can classify complex systems, given at least 1000 data values in diverse settings that include both deterministic chaotic and stochastic processes. The capability to discern changing complexity from such a relatively small amount of data holds promise for applications of ApEn in a variety of contexts.
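For comparison with sample entropy, here is a similarly illustrative sketch of ApEn as defined by Pincus: it averages the logarithm of the fraction of templates lying within tolerance r of each template (self-matches included, which is the source of ApEn's bias) and takes the difference between embedding dimensions m and m + 1. The parameter defaults are conventional assumptions, not values from the reference.

import numpy as np

def approximate_entropy(x, m=2, r=0.2):
    x = np.asarray(x, dtype=float)
    tol = r * np.std(x)
    n = len(x)

    def phi(dim):
        templates = np.array([x[i:i + dim] for i in range(n - dim + 1)])
        # Fraction of templates within tolerance of each template
        # (Chebyshev distance, self-match included), then the mean log.
        fractions = [np.mean(np.max(np.abs(templates - t), axis=1) <= tol)
                     for t in templates]
        return np.mean(np.log(fractions))

    return phi(m) - phi(m + 1)

print(approximate_entropy(np.random.default_rng(0).normal(size=1000)))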

              Permutation Entropy: A Natural Complexity Measure for Time Series

              We introduce complexity parameters for time series based on comparison of neighboring values. The definition directly applies to arbitrary real-world data. For some well-known chaotic dynamical systems it is shown that our complexity behaves similar to Lyapunov exponents, and is particularly useful in the presence of dynamical or observational noise. The advantages of our method are its simplicity, extremely fast calculation, robustness, and invariance with respect to nonlinear monotonous transformations.
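The ordinal-pattern idea described above can be sketched in a few lines: map each window of `order` consecutive (delay-spaced) samples to its ranking pattern, estimate the distribution of patterns, and take its Shannon entropy, normalised here by log(order!). The order and delay defaults are illustrative assumptions, and tie-handling refinements (relevant to modified permutation entropy) are omitted.

import math
from collections import Counter
import numpy as np

def permutation_entropy(x, order=3, delay=1):
    x = np.asarray(x, dtype=float)
    counts = Counter()
    for i in range(len(x) - (order - 1) * delay):
        window = x[i:i + (order - 1) * delay + 1:delay]
        counts[tuple(np.argsort(window))] += 1   # ordinal (ranking) pattern
    probs = np.array(list(counts.values()), dtype=float)
    probs /= probs.sum()
    pe = -np.sum(probs * np.log(probs))
    return pe / math.log(math.factorial(order))  # normalise to [0, 1]

# Periodic signals give low values; white noise approaches 1.
t = np.linspace(0, 10 * np.pi, 2000)
print(permutation_entropy(np.sin(t)),
      permutation_entropy(np.random.default_rng(0).normal(size=2000)))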

                Author and article information

                Journal
                Entropy (Basel), MDPI, ISSN 1099-4300
                Published: 27 August 2019 (September 2019 issue)
                Volume 21, Issue 9, Article 840
                Affiliations
                [1] Centre for Biomedical Engineering, Department of Mechanical Engineering Sciences, Faculty of Engineering and Physical Sciences, University of Surrey, Guildford GU2 7XH, UK
                [2] Ericsson, Thames Tower, Station Road, Reading RG1 1LX, UK
                Author notes
                [*] Correspondence: d.abasolo@surrey.ac.uk; Tel.: +44-(0)1483-682971
                Author information
                https://orcid.org/0000-0002-4268-2885
                Article
                Article ID: entropy-21-00840
                DOI: 10.3390/e21090840
                PMC: 7515369
                © 2019 by the authors.

                Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

                History
                Received: 29 June 2019
                Accepted: 23 August 2019
                Categories
                Article

                Keywords: permutation entropy, modified permutation entropy, sample entropy, quadratic entropy, fuzzy entropy, electroencephalogram, non-linear analysis
