
      An Air Quality Prediction Model Based on a Noise Reduction Self-Coding Deep Network

      1 , 1 , 1 , 1 , 1
      Mathematical Problems in Engineering
      Hindawi Limited


          Abstract

To address the low prediction accuracy of existing air pollutant prediction models, a denoising autoencoder deep network (DAEDN) model based on long short-term memory (LSTM) networks was designed. The model builds a denoising autoencoder around an LSTM network to extract the inherent air quality characteristics of the original monitoring data and to remove noise from those data, thereby improving the accuracy of air quality predictions. The LSTM structure in the DAEDN model is bidirectional (Bi-LSTM), which eliminates the lag observed in unidirectional LSTM predictions and further improves prediction accuracy. The DAEDN model was trained on hourly PM2.5 concentration time series collected in Beijing over five years. The experimental results show that, after training, the DAEDN model extracts more stable features from noisy input. Evaluated with RMSE and MAE, the model achieves 15.504 and 6.789, respectively, reductions of 7.33% and 5.87% compared with a unidirectional LSTM. In addition, the new prediction model inherently accounts for the time series properties of pollutant concentration prediction and fully integrates environmental big data such as air quality monitoring, meteorological monitoring, and forecasts.
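The abstract describes but does not show the DAEDN architecture. The sketch below is a minimal, hypothetical PyTorch rendering of a bidirectional-LSTM denoising autoencoder, together with the RMSE/MAE metrics named above. All names (BiLSTMDenoisingAutoencoder, train_step, rmse_mae), layer sizes, and the Gaussian corruption level are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (not the authors' code): a Bi-LSTM denoising autoencoder for
# hourly PM2.5 windows, plus the RMSE/MAE metrics cited in the abstract.
import torch
import torch.nn as nn

class BiLSTMDenoisingAutoencoder(nn.Module):
    def __init__(self, n_features=1, hidden_size=64):
        super().__init__()
        # Encoder and decoder are bidirectional LSTMs; the decoder output is
        # projected back to the original feature dimension.
        self.encoder = nn.LSTM(n_features, hidden_size,
                               batch_first=True, bidirectional=True)
        self.decoder = nn.LSTM(2 * hidden_size, hidden_size,
                               batch_first=True, bidirectional=True)
        self.out = nn.Linear(2 * hidden_size, n_features)

    def forward(self, x):
        h, _ = self.encoder(x)      # (batch, time, 2*hidden_size)
        h, _ = self.decoder(h)
        return self.out(h)          # reconstructed series, same shape as x

def train_step(model, optimizer, x_clean, noise_std=0.1):
    # Denoising objective: corrupt the input, reconstruct the clean target.
    x_noisy = x_clean + noise_std * torch.randn_like(x_clean)
    loss = nn.functional.mse_loss(model(x_noisy), x_clean)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

def rmse_mae(y_true, y_pred):
    err = y_pred - y_true
    return torch.sqrt((err ** 2).mean()).item(), err.abs().mean().item()
```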


Most cited references (10)


Representational power of restricted Boltzmann machines and deep belief networks.

          Deep belief networks (DBN) are generative neural network models with many layers of hidden explanatory factors, recently introduced by Hinton, Osindero, and Teh (2006) along with a greedy layer-wise unsupervised learning algorithm. The building block of a DBN is a probabilistic model called a restricted Boltzmann machine (RBM), used to represent one layer of the model. Restricted Boltzmann machines are interesting because inference is easy in them and because they have been successfully used as building blocks for training deeper models. We first prove that adding hidden units yields strictly improved modeling power, while a second theorem shows that RBMs are universal approximators of discrete distributions. We then study the question of whether DBNs with more layers are strictly more powerful in terms of representational power. This suggests a new and less greedy criterion for training RBMs within DBNs.
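As a concrete companion to the RBM/DBN description above, here is a minimal NumPy sketch of a binary restricted Boltzmann machine trained with one-step contrastive divergence (CD-1); a DBN would stack such machines, feeding each trained layer's hidden activations to the next as data. The class name, learning rate, and update rule details are illustrative, not taken from the cited paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBM:
    """Binary restricted Boltzmann machine trained with CD-1 (illustrative sketch)."""
    def __init__(self, n_visible, n_hidden, lr=0.05):
        self.W = 0.01 * rng.standard_normal((n_visible, n_hidden))
        self.b_v = np.zeros(n_visible)   # visible biases
        self.b_h = np.zeros(n_hidden)    # hidden biases
        self.lr = lr

    def cd1(self, v0):
        # Positive phase: hidden probabilities given a batch of data v0.
        p_h0 = sigmoid(v0 @ self.W + self.b_h)
        h0 = (rng.random(p_h0.shape) < p_h0).astype(float)
        # Negative phase: one Gibbs step back to the visible layer and up again.
        p_v1 = sigmoid(h0 @ self.W.T + self.b_v)
        p_h1 = sigmoid(p_v1 @ self.W + self.b_h)
        # Approximate log-likelihood gradient and update the parameters.
        self.W += self.lr * (v0.T @ p_h0 - p_v1.T @ p_h1) / len(v0)
        self.b_v += self.lr * (v0 - p_v1).mean(axis=0)
        self.b_h += self.lr * (p_h0 - p_h1).mean(axis=0)
```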

            Deep learning architecture for air quality predictions.

            With the rapid development of urbanization and industrialization, many developing countries are suffering from heavy air pollution. Governments and citizens have expressed increasing concern regarding air pollution because it affects human health and sustainable development worldwide. Current air quality prediction methods mainly use shallow models; however, these methods produce unsatisfactory results, which inspired us to investigate methods of predicting air quality based on deep architecture models. In this paper, a novel spatiotemporal deep learning (STDL)-based air quality prediction method that inherently considers spatial and temporal correlations is proposed. A stacked autoencoder (SAE) model is used to extract inherent air quality features, and it is trained in a greedy layer-wise manner. Compared with traditional time series prediction models, our model can predict the air quality of all stations simultaneously and shows the temporal stability in all seasons. Moreover, a comparison with the spatiotemporal artificial neural network (STANN), auto regression moving average (ARMA), and support vector regression (SVR) models demonstrates that the proposed method of performing air quality predictions has a superior performance.
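The greedy layer-wise training of a stacked autoencoder mentioned above can be sketched as follows. This is an illustrative PyTorch outline under assumed layer sizes and hyperparameters, not the STDL implementation from the cited paper.

```python
import torch
import torch.nn as nn

def pretrain_stacked_autoencoder(x, layer_sizes, epochs=50, lr=1e-3):
    """Greedy layer-wise pre-training: train one autoencoder per layer and
    feed its (detached) code to the next layer as input."""
    encoders, current, in_dim = [], x, x.shape[1]
    for out_dim in layer_sizes:
        enc, dec = nn.Linear(in_dim, out_dim), nn.Linear(out_dim, in_dim)
        opt = torch.optim.Adam(list(enc.parameters()) + list(dec.parameters()), lr=lr)
        for _ in range(epochs):
            recon = dec(torch.sigmoid(enc(current)))
            loss = nn.functional.mse_loss(recon, current)
            opt.zero_grad()
            loss.backward()
            opt.step()
        encoders.append(enc)
        current = torch.sigmoid(enc(current)).detach()  # input for the next layer
        in_dim = out_dim
    return encoders  # stack these and fine-tune end-to-end with a regression head
```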

              Dynamically pre-trained deep recurrent neural networks using environmental monitoring data for predicting PM2.5

Fine particulate matter (PM2.5) has a considerable impact on human health, the environment and climate change. It is estimated that with better predictions, US$9 billion can be saved over a 10-year period in the USA (State of the science fact sheet air quality. http://www.noaa.gov/factsheets/new, 2012). Therefore, it is crucial to keep developing models and systems that can accurately predict the concentration of major air pollutants. In this paper, our target is to predict PM2.5 concentration in Japan using environmental monitoring data obtained from physical sensors with improved accuracy over the currently employed prediction models. To do so, we propose a deep recurrent neural network (DRNN) that is enhanced with a novel pre-training method using an auto-encoder especially designed for time series prediction. Additionally, sensor selection is performed within the DRNN without harming the accuracy of the predictions by taking advantage of the sparsity found in the network. The numerical experiments show that the DRNN with our proposed pre-training method is superior to a canonical and a state-of-the-art auto-encoder training method when applied to time series prediction. The experiments confirm that, when compared against the PM2.5 prediction system VENUS (National Institute for Environmental Studies. Visual Atmospheric Environment Utility System. http://envgis5.nies.go.jp/osenyosoku/, 2014), our technique improves the accuracy of the PM2.5 concentration level predictions that are being reported in Japan.
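To make the pre-training idea concrete, the sketch below shows one plausible, hypothetical way to warm-start a deep recurrent predictor from an autoencoder trained on the monitoring inputs; the cited paper's actual pre-training scheme and sparsity-based sensor selection are more elaborate. All names, layer sizes, and the assumption that the code dimension equals the RNN hidden size are illustrative.

```python
import torch
import torch.nn as nn

def pretrain_encoder(x, code_dim, epochs=100, lr=1e-3):
    # Train a one-layer autoencoder on flattened sensor vectors x: (N, in_dim).
    in_dim = x.shape[1]
    enc, dec = nn.Linear(in_dim, code_dim), nn.Linear(code_dim, in_dim)
    opt = torch.optim.Adam(list(enc.parameters()) + list(dec.parameters()), lr=lr)
    for _ in range(epochs):
        loss = nn.functional.mse_loss(dec(torch.tanh(enc(x))), x)
        opt.zero_grad()
        loss.backward()
        opt.step()
    return enc

class DeepRNNPredictor(nn.Module):
    def __init__(self, in_dim, hidden, pretrained_enc=None):
        super().__init__()
        self.proj = nn.Linear(in_dim, hidden)          # input projection
        if pretrained_enc is not None:
            # Warm-start the projection with the pre-trained encoder
            # (requires code_dim == hidden so the shapes match).
            self.proj.load_state_dict(pretrained_enc.state_dict())
        self.rnn = nn.RNN(hidden, hidden, num_layers=2, batch_first=True)
        self.head = nn.Linear(hidden, 1)               # next-step PM2.5 estimate

    def forward(self, x):                              # x: (batch, time, in_dim)
        h, _ = self.rnn(torch.tanh(self.proj(x)))
        return self.head(h[:, -1])
```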

                Author and article information

Journal: Mathematical Problems in Engineering
Publisher: Hindawi Limited
ISSN: 1024-123X (print); 1563-5147 (electronic)
Published: May 15, 2020
Volume: 2020
Pages: 1-12
Affiliations
[1] College of Electronic Science and Control Engineering, Institute of Disaster Prevention, Sanhe 065201, China
Article
DOI: 10.1155/2020/3507197
17ee7a95-6c94-4533-b69b-b8fa221e1452
© 2020
License: http://creativecommons.org/licenses/by/4.0/ (CC BY 4.0)
