
      Fractal Spiking Neural Network Scheme for EEG-Based Emotion Recognition

      research-article


          Abstract

          Electroencephalogram (EEG)-based emotion recognition is of great significance for aiding in clinical diagnosis, treatment, nursing, and rehabilitation. Current research on this issue mainly focuses on utilizing various network architectures with different types of neurons to exploit the temporal, spectral, or spatial information from EEG for classification. However, most studies fail to take full advantage of the useful Temporal-Spectral-Spatial (TSS) information of EEG signals. In this paper, we propose a novel and effective Fractal Spiking Neural Network (Fractal-SNN) scheme, which can exploit the multi-scale TSS information from EEG, for emotion recognition. Our designed Fractal-SNN block in the proposed scheme approximately simulates the biological neural connection structures based on spiking neurons and a new fractal rule, allowing for the extraction of discriminative multi-scale TSS features from the signals. Our designed training technique, inverted drop-path, can enhance the generalization ability of the Fractal-SNN scheme. Extensive experiments on four public benchmark databases, DREAMER, DEAP, SEED-IV and MPED, under subject-dependent protocols demonstrate the superiority of the proposed scheme over related advanced methods. In summary, the proposed scheme provides a promising solution for EEG-based emotion recognition.
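The abstract does not spell out the fractal rule or the inverted drop-path variant. As a rough, purely illustrative sketch, a FractalNet-style recursive expansion (one plausible reading of a "fractal rule", not the paper's actual definition) and plain drop-path regularization (the paper uses an "inverted" variant, whose details are not given here) might look like this; all function names are hypothetical:

```python
import random

def fractal_block(depth, base_layer):
    """FractalNet-style expansion: f_1(x) = base(x);
    f_{k+1}(x) = mean(base(x), f_k(f_k(x))) - a shallow and a deep branch joined by averaging."""
    if depth == 1:
        return base_layer
    sub = fractal_block(depth - 1, base_layer)
    def f(x):
        return 0.5 * (base_layer(x) + sub(sub(x)))
    return f

def drop_path_join(branches, x, p_drop=0.15, rng=random):
    """Standard drop-path join: randomly drop branches during training,
    average the survivors, and always keep at least one branch."""
    kept = [b for b in branches if rng.random() >= p_drop]
    if not kept:
        kept = [rng.choice(branches)]
    outs = [b(x) for b in kept]
    return sum(outs) / len(outs)
```

With `base_layer = lambda x: x + 1.0`, a depth-3 block maps `0.0` to `2.0`, since each expansion mixes a one-layer path with a doubled deeper path.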

          Related collections

          Most cited references (37)


          DEAP: A Database for Emotion Analysis Using Physiological Signals

          IEEE Transactions on Affective Computing, 3(1), 18-31

            Neural Machine Translation by Jointly Learning to Align and Translate

            Neural machine translation is a recently proposed approach to machine translation. Unlike the traditional statistical machine translation, the neural machine translation aims at building a single neural network that can be jointly tuned to maximize the translation performance. The models proposed recently for neural machine translation often belong to a family of encoder-decoders and consist of an encoder that encodes a source sentence into a fixed-length vector from which a decoder generates a translation. In this paper, we conjecture that the use of a fixed-length vector is a bottleneck in improving the performance of this basic encoder-decoder architecture, and propose to extend this by allowing a model to automatically (soft-)search for parts of a source sentence that are relevant to predicting a target word, without having to form these parts as a hard segment explicitly. With this new approach, we achieve a translation performance comparable to the existing state-of-the-art phrase-based system on the task of English-to-French translation. Furthermore, qualitative analysis reveals that the (soft-)alignments found by the model agree well with our intuition. Accepted at ICLR 2015 as oral presentation.
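The "(soft-)search" described above is additive (Bahdanau-style) attention: each source position is scored against the decoder state, the scores are normalized with a softmax, and the context is a weighted sum of the source annotations. A minimal NumPy sketch with toy dimensions and randomly initialized weights (illustrative only, not the paper's exact parameterization):

```python
import numpy as np

def additive_attention(query, keys, values, Wq, Wk, v):
    """Additive attention: score_i = v . tanh(Wq @ query + Wk @ key_i);
    softmax the scores, then return the weighted sum of the values."""
    scores = np.array([v @ np.tanh(Wq @ query + Wk @ k) for k in keys])
    weights = np.exp(scores - scores.max())   # stable softmax
    weights /= weights.sum()
    context = (weights[:, None] * values).sum(axis=0)
    return context, weights
```

The weights always sum to one, so the context vector stays in the span of the source annotations regardless of sentence length, which is what removes the fixed-length-vector bottleneck.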

              EmotionMeter: A Multimodal Framework for Recognizing Human Emotions

              In this paper, we present a multimodal emotion recognition framework called EmotionMeter that combines brain waves and eye movements. To increase the feasibility and wearability of EmotionMeter in real-world applications, we design a six-electrode placement above the ears to collect electroencephalography (EEG) signals. We combine EEG and eye movements for integrating the internal cognitive states and external subconscious behaviors of users to improve the recognition accuracy of EmotionMeter. The experimental results demonstrate that modality fusion with multimodal deep neural networks can significantly enhance the performance compared with a single modality, and the best mean accuracy of 85.11% is achieved for four emotions (happy, sad, fear, and neutral). We explore the complementary characteristics of EEG and eye movements for their representational capacities and identify that EEG has the advantage of classifying happy emotion, whereas eye movements outperform EEG in recognizing fear emotion. To investigate the stability of EmotionMeter over time, each subject performs the experiments three times on different days. EmotionMeter obtains a mean recognition accuracy of 72.39% across sessions with the six-electrode EEG and eye movement features. These experimental results demonstrate the effectiveness of EmotionMeter within and between sessions.
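EmotionMeter fuses modalities with multimodal deep neural networks; as a much-simplified illustration of the idea, feature-level fusion can be sketched as normalizing each modality's feature vector and concatenating them with per-modality weights before a shared classifier (function names hypothetical, not the paper's architecture):

```python
import numpy as np

def fuse_modalities(eeg_feat, eye_feat, w_eeg=0.5, w_eye=0.5):
    """Toy feature-level fusion: z-score each modality so neither
    dominates by scale, weight them, and concatenate for a classifier."""
    def zscore(x):
        return (x - x.mean()) / (x.std() + 1e-8)
    return np.concatenate([w_eeg * zscore(eeg_feat), w_eye * zscore(eye_feat)])
```

Normalizing per modality matters because EEG band powers and eye-movement statistics live on very different scales; without it, one modality's features can swamp the other in the fused vector.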

                Author and article information

                Contributors
                Journal
                IEEE J Transl Eng Health Med
                IEEE Journal of Translational Engineering in Health and Medicine
                IEEE
                2168-2372
                2024
                28 September 2023
                Volume 12: 106-118
                Affiliations
                [1] School of Instrument Science and Engineering, Southeast University, Nanjing, Jiangsu 210096, China
                Article
                JTEHM-00061-2023
                DOI: 10.1109/JTEHM.2023.3320132
                10712674
                38088998
                ec8f018e-1c94-4269-a85f-76ee1ab49c38
                © 2023 The Authors

                This work is licensed under a Creative Commons Attribution 4.0 License. For more information, see https://creativecommons.org/licenses/by/4.0/

                History
                : 08 May 2023
                : 17 September 2023
                : 19 September 2023
                : 24 November 2023
                Page count
                Figures: 13, Tables: 9, Equations: 329, References: 37, Pages: 13
                Funding
                Funded by: Basic Research Project of Leading Technology of Jiangsu Province;
                Award ID: BK20192004
                This work was supported by the Basic Research Project of Leading Technology of Jiangsu Province under Grant BK20192004.
                Categories
                Article

                electroencephalogram, fractal spiking neural network, inverted drop-path, emotion recognition
