
      Deep Recurrent Models with Fast-Forward Connections for Neural Machine Translation

      Jie Zhou [1], Ying Cao [1], Xuguang Wang [1], Peng Li [1], Wei Xu [1]
      Transactions of the Association for Computational Linguistics
      MIT Press - Journals


          Abstract

          Neural machine translation (NMT) aims at solving machine translation (MT) problems using neural networks and has exhibited promising results in recent years. However, most existing NMT models are shallow, and there is still a performance gap between a single NMT model and the best conventional MT system. In this work, we introduce a new type of linear connection, named the fast-forward connection, based on deep Long Short-Term Memory (LSTM) networks, and an interleaved bi-directional architecture for stacking the LSTM layers. Fast-forward connections play an essential role in propagating the gradients and building a deep topology of depth 16. On the WMT'14 English-to-French task, we achieve BLEU=37.7 with a single attention model, which outperforms the corresponding single shallow model by 6.2 BLEU points. This is the first time that a single NMT model has achieved state-of-the-art performance, outperforming the best conventional model by 0.7 BLEU points. We can still achieve BLEU=36.3 even without using an attention mechanism. After special handling of unknown words and model ensembling, we obtain the best score reported to date on this task with BLEU=40.4. Our models are also validated on the more difficult WMT'14 English-to-German task.
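          The fast-forward connection described in the abstract can be sketched in code: each recurrent layer is paired with a purely linear path on its input, and the next layer receives the concatenation of the recurrent hidden states and that linear path. The sketch below is an illustrative reconstruction from the abstract only, not the authors' implementation: a simple tanh cell stands in for the full LSTM, the interleaved bi-directional stacking is omitted, and all names and sizes (`D_H`, `DEPTH`, `W_f`, etc.) are toy assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
D_IN, D_H, DEPTH, T = 8, 16, 4, 5  # toy dims: input, hidden, layers, time steps

def make_layer(d_in, d_h, rng):
    """One recurrent layer plus its fast-forward (purely linear) path."""
    return {
        "W_x": rng.normal(scale=0.1, size=(d_h, d_in)),  # input -> hidden
        "W_h": rng.normal(scale=0.1, size=(d_h, d_h)),   # hidden -> hidden
        "W_f": rng.normal(scale=0.1, size=(d_h, d_in)),  # fast-forward: no nonlinearity
    }

def run_stack(x_seq):
    """x_seq: (T, D_IN). Layer k passes layer k+1 the concatenation
    [h_k; f_k], where f_k = W_f @ x_t bypasses the recurrent nonlinearity.
    Per the abstract, this linear path is what lets gradients propagate
    through a depth-16 topology; DEPTH=4 here keeps the sketch small."""
    layer_in = x_seq
    for _ in range(DEPTH):
        L = make_layer(layer_in.shape[1], D_H, rng)
        h = np.zeros(D_H)
        hs, fs = [], []
        for x_t in layer_in:
            h = np.tanh(L["W_x"] @ x_t + L["W_h"] @ h)  # recurrent cell (LSTM stand-in)
            f = L["W_f"] @ x_t                          # fast-forward: linear only
            hs.append(h)
            fs.append(f)
        # next layer sees both the recurrent output and the linear path
        layer_in = np.concatenate([np.stack(hs), np.stack(fs)], axis=1)
    return layer_in

out = run_stack(rng.normal(size=(T, D_IN)))
print(out.shape)  # (T, 2 * D_H) -> (5, 32)
```

          The point of the design, as the abstract frames it, is that `f` carries the signal upward with no squashing nonlinearity in its path, so deep stacks remain trainable where plain stacked LSTMs would not.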


          Author and article information

          Journal
          Transactions of the Association for Computational Linguistics
          MIT Press - Journals
          ISSN: 2307-387X
          December 2016
          Volume: 4
          Pages: 371-383
          Affiliations
          [1] Baidu Research - Institute of Deep Learning, Baidu Inc., Beijing, China
          Article
          DOI: 10.1162/tacl_a_00105
          © 2016
