
Unsupervised neural networks for automatic Arabic text summarization using document clustering and topic modeling

Expert Systems with Applications
Elsevier BV



Most cited references (55)


Reducing the dimensionality of data with neural networks.

High-dimensional data can be converted to low-dimensional codes by training a multilayer neural network with a small central layer to reconstruct high-dimensional input vectors. Gradient descent can be used for fine-tuning the weights in such "autoencoder" networks, but this works well only if the initial weights are close to a good solution. We describe an effective way of initializing the weights that allows deep autoencoder networks to learn low-dimensional codes that work much better than principal components analysis as a tool to reduce the dimensionality of data.
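The reconstruct-the-input objective described above can be illustrated with a minimal sketch. This is not the cited paper's method (its contribution is layer-wise pretraining that gives deep autoencoders good initial weights); it is a single-hidden-layer linear autoencoder trained by plain gradient descent on synthetic data, with all dimensions and learning-rate values chosen arbitrarily for the illustration.

```python
import numpy as np

# Toy data: 200 samples of 10-dimensional input.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))

# Encoder maps 10 dims to a 3-dim code; decoder maps back to 10 dims.
W_enc = rng.normal(scale=0.1, size=(10, 3))
W_dec = rng.normal(scale=0.1, size=(3, 10))

def loss(X, W_enc, W_dec):
    # Mean squared reconstruction error: how well the code recovers X.
    R = X @ W_enc @ W_dec
    return float(np.mean((X - R) ** 2))

lr = 0.05
initial = loss(X, W_enc, W_dec)
for _ in range(500):
    code = X @ W_enc
    err = code @ W_dec - X              # reconstruction residual
    # Gradients of the squared error w.r.t. each weight matrix
    # (constant factors folded into the learning rate).
    grad_dec = code.T @ err / len(X)
    grad_enc = X.T @ (err @ W_dec.T) / len(X)
    W_dec -= lr * grad_dec
    W_enc -= lr * grad_enc
final = loss(X, W_enc, W_dec)
```

After training, `final` is lower than `initial`: the 3-dimensional code has learned to retain as much of the input as a linear bottleneck allows, which for this linear case is equivalent to what PCA captures; the paper's point is that deep nonlinear autoencoders, properly initialized, do substantially better.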

Feature selection based on mutual information: criteria of max-dependency, max-relevance, and min-redundancy.

Feature selection is an important problem for pattern classification systems. We study how to select good features according to the maximal statistical dependency criterion based on mutual information. Because of the difficulty in directly implementing the maximal dependency condition, we first derive an equivalent form, called minimal-redundancy-maximal-relevance criterion (mRMR), for first-order incremental feature selection. Then, we present a two-stage feature selection algorithm by combining mRMR and other more sophisticated feature selectors (e.g., wrappers). This allows us to select a compact set of superior features at very low cost. We perform extensive experimental comparison of our algorithm and other methods using three different classifiers (naive Bayes, support vector machine, and linear discriminant analysis) and four different data sets (handwritten digits, arrhythmia, NCI cancer cell lines, and lymphoma tissues). The results confirm that mRMR leads to promising improvement on feature selection and classification accuracy.
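The incremental mRMR criterion described above can be sketched as a greedy loop: at each step, add the feature whose relevance to the label (mutual information) minus its mean redundancy with the already-selected features is highest. This is a minimal illustration for discrete features, not the paper's two-stage algorithm; the toy data and all names (`mutual_info`, `mrmr`) are this sketch's own.

```python
import numpy as np

def mutual_info(x, y):
    # Empirical mutual information (in nats) between two discrete arrays.
    mi = 0.0
    for xv in np.unique(x):
        px = np.mean(x == xv)
        for yv in np.unique(y):
            pxy = np.mean((x == xv) & (y == yv))
            if pxy > 0:
                mi += pxy * np.log(pxy / (px * np.mean(y == yv)))
    return mi

def mrmr(X, y, k):
    # Greedy mRMR: repeatedly add the feature maximizing
    # relevance I(f; y) minus mean redundancy with selected features.
    relevance = [mutual_info(X[:, j], y) for j in range(X.shape[1])]
    selected, remaining = [], list(range(X.shape[1]))
    for _ in range(k):
        def score(j):
            red = (np.mean([mutual_info(X[:, j], X[:, s]) for s in selected])
                   if selected else 0.0)
            return relevance[j] - red
        best = max(remaining, key=score)
        selected.append(best)
        remaining.remove(best)
    return selected

# Toy data: features 0 and 1 are identical copies of one signal,
# feature 2 is an independent signal; the label depends on both signals.
rng = np.random.default_rng(0)
a = rng.integers(0, 2, size=300)
b = rng.integers(0, 2, size=300)
X = np.column_stack([a, a, b])
label = 2 * a + b
picked = mrmr(X, label, k=2)
```

Because features 0 and 1 are duplicates, plain max-relevance would happily pick both; the redundancy penalty steers the second pick to feature 2 instead, so `picked` covers both distinct signals.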

The anatomy of a large-scale hypertextual Web search engine


Author and article information

Journal: Expert Systems with Applications
Publisher: Elsevier BV
ISSN: 0957-4174
Date: June 2021
Volume: 172
Article number: 114652
DOI: 10.1016/j.eswa.2021.114652
© 2021

License: https://www.elsevier.com/tdm/userlicense/1.0/
