
      A deep learning framework for financial time series using stacked autoencoders and long-short term memory


          Abstract

          The application of deep learning approaches to finance has received a great deal of attention from both investors and researchers. This study presents a novel deep learning framework in which wavelet transforms (WT), stacked autoencoders (SAEs) and long short-term memory (LSTM) are combined for stock price forecasting. SAEs for hierarchically extracting deep features are introduced into stock price forecasting for the first time. The framework comprises three stages. First, the stock price time series is decomposed by WT to eliminate noise. Second, SAEs are applied to generate deep high-level features for predicting the stock price. Third, the high-level denoised features are fed into LSTM to forecast the next day's closing price. Six market indices and their corresponding index futures are chosen to examine the performance of the proposed model. The results show that the proposed model outperforms similar models in both predictive accuracy and profitability.

          Most cited references (87)


          Long Short-Term Memory

          Learning to store information over extended time intervals by recurrent backpropagation takes a very long time, mostly because of insufficient, decaying error backflow. We briefly review Hochreiter's (1991) analysis of this problem, then address it by introducing a novel, efficient, gradient-based method called long short-term memory (LSTM). Truncating the gradient where this does not do harm, LSTM can learn to bridge minimal time lags in excess of 1000 discrete-time steps by enforcing constant error flow through constant error carousels within special units. Multiplicative gate units learn to open and close access to the constant error flow. LSTM is local in space and time; its computational complexity per time step and weight is O(1). Our experiments with artificial data involve local, distributed, real-valued, and noisy pattern representations. In comparisons with real-time recurrent learning, backpropagation through time, recurrent cascade correlation, Elman nets, and neural sequence chunking, LSTM leads to many more successful runs, and learns much faster. LSTM also solves complex, artificial long-time-lag tasks that have never been solved by previous recurrent network algorithms.

            Going deeper with convolutions


              Reducing the dimensionality of data with neural networks.

              High-dimensional data can be converted to low-dimensional codes by training a multilayer neural network with a small central layer to reconstruct high-dimensional input vectors. Gradient descent can be used for fine-tuning the weights in such "autoencoder" networks, but this works well only if the initial weights are close to a good solution. We describe an effective way of initializing the weights that allows deep autoencoder networks to learn low-dimensional codes that work much better than principal components analysis as a tool to reduce the dimensionality of data.

                Author and article information

                Contributors
                Role: Editor (University of Rijeka, Croatia)

                Journal
                PLoS ONE, Public Library of Science (San Francisco, CA, USA)
                eISSN: 1932-6203
                Published: 14 July 2017
                Volume 12, Issue 7: e0180944

                Affiliations
                [1] Business School, Central South University, Changsha, China
                [2] Institute of Remote Sensing and Geographic Information System, Peking University, Beijing, China
                Author notes

                Competing Interests: The authors have declared that no competing interests exist.

                • Conceptualization: WB JY YR.

                • Data curation: WB.

                • Funding acquisition: YR.

                • Methodology: WB JY YR.

                • Project administration: JY WB YR.

                • Software: JY WB.

                • Writing – original draft: WB JY YR.

                • Writing – review & editing: WB JY YR.

                Author information
                ORCID: http://orcid.org/0000-0002-6465-5052

                Article identifiers
                Manuscript: PONE-D-16-50177
                DOI: 10.1371/journal.pone.0180944
                PMCID: PMC5510866
                PMID: 28708865
                Record ID: 79e63df8-0b5f-4565-ab5c-91c2aac61cd7

                © 2017 Bao et al

                This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

                History
                Received: 20 December 2016
                Accepted: 10 June 2017
                Page count
                Figures: 8, Tables: 6, Pages: 24
                Funding
                Funded by: National Natural Science Foundation of China (funder ID: http://dx.doi.org/10.13039/501100001809)
                Award IDs: 71372063, 71673306
                This work was supported by the National Natural Science Foundation of China (grant numbers 71372063 and 71673306; http://www.nsfc.gov.cn/). The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.
                Categories
                Research Article
                Social Sciences > Economics > Finance
                Biology and Life Sciences > Neuroscience > Cognitive Science > Cognition > Memory
                Biology and Life Sciences > Neuroscience > Learning and Memory > Memory
                Computer and Information Sciences > Neural Networks
                Biology and Life Sciences > Neuroscience > Neural Networks
                Research and Analysis Methods > Mathematical and Statistical Techniques > Statistical Methods > Forecasting
                Physical Sciences > Mathematics > Statistics (Mathematics) > Statistical Methods > Forecasting
                Research and Analysis Methods > Mathematical and Statistical Techniques > Mathematical Functions > Wavelet Transforms
                Social Sciences > Economics > Financial Markets > Capital Markets > Stock Markets
                Biology and Life Sciences > Neuroscience > Cognitive Science > Cognitive Psychology > Learning
                Biology and Life Sciences > Psychology > Cognitive Psychology > Learning
                Social Sciences > Psychology > Cognitive Psychology > Learning
                Biology and Life Sciences > Neuroscience > Learning and Memory > Learning
                Social Sciences > Economics > Financial Markets
                Custom metadata
                All data are available from the figshare database (DOI: 10.6084/m9.figshare.5028110).
