      Is Open Access

      Robust Speech Emotion Recognition Using CNN+LSTM Based on Stochastic Fractal Search Optimization Algorithm

Most cited references (60)

          Backpropagation Applied to Handwritten Zip Code Recognition

            Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift

            Training Deep Neural Networks is complicated by the fact that the distribution of each layer's inputs changes during training, as the parameters of the previous layers change. This slows down the training by requiring lower learning rates and careful parameter initialization, and makes it notoriously hard to train models with saturating nonlinearities. We refer to this phenomenon as internal covariate shift, and address the problem by normalizing layer inputs. Our method draws its strength from making normalization a part of the model architecture and performing the normalization for each training mini-batch. Batch Normalization allows us to use much higher learning rates and be less careful about initialization. It also acts as a regularizer, in some cases eliminating the need for Dropout. Applied to a state-of-the-art image classification model, Batch Normalization achieves the same accuracy with 14 times fewer training steps, and beats the original model by a significant margin. Using an ensemble of batch-normalized networks, we improve upon the best published result on ImageNet classification: reaching 4.9% top-5 validation error (and 4.8% test error), exceeding the accuracy of human raters.
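As a rough illustration of the normalization step this abstract describes, the sketch below normalizes each feature of one mini-batch to zero mean and unit variance and then applies a learned scale and shift. It is a minimal NumPy sketch of the train-time forward pass only, not the paper's implementation; the names (batch_norm_forward, gamma, beta) and the eps value are illustrative assumptions, and the running statistics used at inference are omitted.

```python
import numpy as np

def batch_norm_forward(x, gamma, beta, eps=1e-5):
    """Normalize each feature over the mini-batch, then scale and shift.

    x     : (batch_size, num_features) activations of one layer
    gamma : (num_features,) learned scale
    beta  : (num_features,) learned shift
    eps   : small constant for numerical stability (illustrative value)
    """
    mu = x.mean(axis=0)                    # per-feature mini-batch mean
    var = x.var(axis=0)                    # per-feature mini-batch variance
    x_hat = (x - mu) / np.sqrt(var + eps)  # normalized activations
    return gamma * x_hat + beta            # learned scale/shift restores capacity

# Usage: a random mini-batch of 32 examples with 4 features.
x = np.random.randn(32, 4) * 3.0 + 1.5
out = batch_norm_forward(x, gamma=np.ones(4), beta=np.zeros(4))
print(out.mean(axis=0), out.std(axis=0))   # per-feature mean ~0, std ~1
```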

              Model selection for ecologists: the worldviews of AIC and BIC.

                Author and article information

                Journal
                IEEE Access
                Institute of Electrical and Electronics Engineers (IEEE)
                ISSN: 2169-3536
                2022
                Volume 10: 49265-49284
                Affiliations
                [1 ]Department of Computer Science, Faculty of Computer and Information Sciences, Ain Shams University, Cairo, Egypt
                [2 ]Department of Communications and Electronics, Delta Higher Institute of Engineering and Technology (DHIET), Mansoura, Egypt
                [3 ]Department of Information Technology, Faculty of Computers and Information Technology, University of Tabuk, Tabuk, Saudi Arabia
                [4 ]Department of Electrical Engineering, Faculty of Engineering, Benha University, Benha, Egypt
                [5 ]Department of Computer Engineering and Control Systems, Faculty of Engineering, Mansoura University, Mansoura, Egypt
                [6 ]Faculty of Artificial Intelligence, Delta University for Science and Technology, Mansoura, Egypt
                Article
                DOI: 10.1109/ACCESS.2022.3172954
                © 2022

                https://creativecommons.org/licenses/by/4.0/legalcode
