Recurrent Neural Network



The RNN, LSTM, and GRU model diagrams below are taken from here.
A brief survey.

1. RNN


Figure 1.1: Structure of the standard RNN model
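
A minimal sketch of the update in Figure 1.1, written in NumPy; the parameter names (W_xh, W_hh, b_h) are illustrative labels of mine, not from the original post:

```python
import numpy as np

def rnn_step(x_t, h_prev, W_xh, W_hh, b_h):
    """One step of a vanilla (Elman) RNN: h_t = tanh(W_xh x_t + W_hh h_{t-1} + b_h)."""
    return np.tanh(W_xh @ x_t + W_hh @ h_prev + b_h)
```

Processing a sequence just folds rnn_step over the inputs, carrying h from step to step.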


2. BiRNN
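
A BiRNN (Schuster & Paliwal, 1997) runs one RNN forward and a second one backward over the same sequence and concatenates their states at each position. A hedged sketch reusing rnn_step from above; the per-direction parameter tuples are assumptions of mine, not from the original post:

```python
def birnn(xs, params_fwd, params_bwd, h0_fwd, h0_bwd):
    # Forward pass over the sequence.
    hs_f, h = [], h0_fwd
    for x in xs:
        h = rnn_step(x, h, *params_fwd)
        hs_f.append(h)
    # Backward pass over the reversed sequence.
    hs_b, h = [], h0_bwd
    for x in reversed(xs):
        h = rnn_step(x, h, *params_bwd)
        hs_b.append(h)
    hs_b.reverse()  # re-align backward states with time order
    # Concatenate the two directions at each time step.
    return [np.concatenate([f, b]) for f, b in zip(hs_f, hs_b)]
```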


3. LSTM


Figure 3.1: Structure of the LSTM model
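
A sketch of one LSTM step following the standard gate equations; stacking the four affine transforms into single matrices W, U, b is an implementation convenience assumed here, not something shown in Figure 3.1:

```python
def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x_t, h_prev, c_prev, W, U, b):
    # W, U, b stack the parameters of the i, f, o, g transforms (4H rows).
    z = W @ x_t + U @ h_prev + b
    i, f, o, g = np.split(z, 4)
    i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)  # input, forget, output gates
    g = np.tanh(g)                                # candidate cell update
    c_t = f * c_prev + i * g                      # cell state: gated old + gated new
    h_t = o * np.tanh(c_t)                        # hidden state exposed downstream
    return h_t, c_t
```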


4. Clockwork RNN

5. Depth Gated RNN

6. Grid LSTM

7. DRAW

8. RLVM


9. GRU


Figure 9.1: Structure of the GRU model
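
One GRU step following Cho et al. (2014), reusing sigmoid from the LSTM sketch. Papers and libraries disagree on whether z or (1 - z) multiplies the previous state, so this is one common convention rather than the only one:

```python
def gru_step(x_t, h_prev, W_z, U_z, W_r, U_r, W_h, U_h):
    z = sigmoid(W_z @ x_t + U_z @ h_prev)              # update gate
    r = sigmoid(W_r @ x_t + U_r @ h_prev)              # reset gate
    h_tilde = np.tanh(W_h @ x_t + U_h @ (r * h_prev))  # candidate state
    return z * h_prev + (1.0 - z) * h_tilde            # interpolate old and candidate
```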


10. NTM


11. QRNN


Figure 11.1: QRNN structure with f-pooling

Figure 11.2: QRNN structure with fo-pooling

Figure 11.3: QRNN structure with ifo-pooling
For more on the QRNN, see here.
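
The three figures differ only in the element-wise pooling recurrence applied after the QRNN's time-wise convolutions. A sketch of fo-pooling under that reading (the convolution that produces the z, f, o sequences is omitted); the other two variants are noted in comments:

```python
def qrnn_fo_pool(zs, fs, os_, c0):
    # fo-pooling: c_t = f_t * c_{t-1} + (1 - f_t) * z_t ;  h_t = o_t * c_t
    # f-pooling drops the output gate (h_t = c_t); ifo-pooling uses a
    # separately computed input gate i_t in place of (1 - f_t).
    hs, c = [], c0
    for z, f, o in zip(zs, fs, os_):
        c = f * c + (1.0 - f) * z   # element-wise, so cheap and parallel-friendly
        hs.append(o * c)
    return hs
```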


12. Persistent RNN


13. SRU


Figure 13.1: Structure of the SRU model
For more on the SRU, see here.
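
A sketch of the SRU recurrence as given in Lei & Zhang (2017). All matrix products depend only on x_t, so they can be batched across the whole sequence, leaving only a cheap element-wise loop; parameter names follow the paper, and sigmoid is reused from the LSTM sketch:

```python
def sru_layer(xs, W, W_f, b_f, W_r, b_r, c0):
    hs, c = [], c0
    for x in xs:
        x_tilde = W @ x                       # projection, independent of the state
        f = sigmoid(W_f @ x + b_f)            # forget gate, also state-independent
        r = sigmoid(W_r @ x + b_r)            # reset (highway) gate
        c = f * c + (1.0 - f) * x_tilde       # element-wise recurrence
        h = r * np.tanh(c) + (1.0 - r) * x    # highway connection to the input
        hs.append(h)
    return hs
```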


References
0. [RNN&Depth] - Pascanu R, Gulcehre C, Cho K, et al. How to construct deep recurrent neural networks[J]. arXiv preprint arXiv:1312.6026, 2013.
0. [survey] - Lipton Z C, Berkowitz J, Elkan C. A critical review of recurrent neural networks for sequence learning[J]. arXiv preprint arXiv:1506.00019, 2015.
.. [survey] - Jozefowicz R, Zaremba W, Sutskever I. An empirical exploration of recurrent network architectures[C]//Proceedings of the 32nd International Conference on Machine Learning (ICML-15). 2015: 2342-2350.
.. [survey] - Greff K, Srivastava R K, Koutník J, et al. LSTM: A search space odyssey[J]. IEEE transactions on neural networks and learning systems, 2017.
.. [survey] - Karpathy A, Johnson J, Fei-Fei L. Visualizing and understanding recurrent networks[J]. arXiv preprint arXiv:1506.02078, 2015.

  1. [RNN] - Elman, Jeffrey L. “Finding structure in time.” Cognitive science 14.2 (1990): 179-211.
  2. [BiRNN] - Schuster, Mike, and Kuldip K. Paliwal. “Bidirectional recurrent neural networks.” IEEE Transactions on Signal Processing 45.11 (1997): 2673-2681.
  3. [LSTM] - Hochreiter, Sepp, and Jürgen Schmidhuber. “Long short-term memory.” Neural computation 9.8 (1997): 1735-1780.
    .. [LSTM] - Understanding LSTM Networks
    .. [LSTM Variants] - Gers F A, Schmidhuber J. Recurrent nets that time and count[C]//Neural Networks, 2000. IJCNN 2000, Proceedings of the IEEE-INNS-ENNS International Joint Conference on. IEEE, 2000, 3: 189-194.
  4. [Multi-dimensional RNN] - Alex Graves, Santiago Fernandez, and Jurgen Schmidhuber, Multi-Dimensional Recurrent Neural Networks, ICANN 2007
  5. [GFRNN] - Junyoung Chung, Caglar Gulcehre, Kyunghyun Cho, Yoshua Bengio, Gated Feedback Recurrent Neural Networks, arXiv:1502.02367 / ICML 2015
  6. [Tree-Structured RNNs] - Kai Sheng Tai, Richard Socher, and Christopher D. Manning, Improved Semantic Representations From Tree-Structured Long Short-Term Memory Networks, arXiv:1503.00075 / ACL 2015
    .. [Tree-Structured RNNs] - Samuel R. Bowman, Christopher D. Manning, and Christopher Potts, Tree-structured composition in neural networks without tree-structured architectures, arXiv:1506.04834
  7. [Clockwork RNN] - Koutník J, Greff K, Gomez F, et al. A Clockwork RNN[J]. arXiv preprint arXiv:1402.3511, 2014.
  8. [Depth Gated RNN] - Yao K, Cohn T, Vylomova K, et al. Depth-gated recurrent neural networks[J]. arXiv preprint, 2015.
  9. [Grid LSTM] - Kalchbrenner N, Danihelka I, Graves A. Grid long short-term memory[J]. arXiv preprint arXiv:1507.01526, 2015.
  10. [Segmental RNN] - Lingpeng Kong, Chris Dyer, Noah Smith, "Segmental Recurrent Neural Networks", ICLR 2016.
  11. [Seq2seq for Sets] - Oriol Vinyals, Samy Bengio, Manjunath Kudlur, "Order Matters: Sequence to sequence for sets", ICLR 2016.
  12. [Hierarchical Recurrent Neural Networks] - Junyoung Chung, Sungjin Ahn, Yoshua Bengio, "Hierarchical Multiscale Recurrent Neural Networks", arXiv:1609.01704
  13. [DRAW] - Gregor K, Danihelka I, Graves A, et al. DRAW: A recurrent neural network for image generation[J]. arXiv preprint arXiv:1502.04623, 2015.
  14. [RLVM] - Chung J, Kastner K, Dinh L, et al. A recurrent latent variable model for sequential data[C]//Advances in neural information processing systems. 2015: 2980-2988.
  15. [Generate] - Bayer J, Osendorfer C. Learning stochastic recurrent networks[J]. arXiv preprint arXiv:1411.7610, 2014.
  16. [GRU] - Cho K, Van Merriënboer B, Gulcehre C, et al. Learning phrase representations using RNN encoder-decoder for statistical machine translation[J]. arXiv preprint arXiv:1406.1078, 2014.
    .. [GRU] - Cho K, Van Merriënboer B, Bahdanau D, et al. On the properties of neural machine translation: Encoder-decoder approaches[J]. arXiv preprint arXiv:1409.1259, 2014.
    .. [GRU] - Chung, Junyoung, et al. “Empirical evaluation of gated recurrent neural networks on sequence modeling.” arXiv preprint arXiv:1412.3555 (2014).
  17. [NTM] - Graves, Alex, Greg Wayne, and Ivo Danihelka. “Neural turing machines.” arXiv preprint arXiv:1410.5401 (2014).
  18. [Neural GPU] - Łukasz Kaiser, Ilya Sutskever, "Neural GPUs Learn Algorithms", arXiv:1511.08228 / ICML 2016 (under review)
  19. [QRNN] - Bradbury J, Merity S, Xiong C, et al. Quasi-recurrent neural networks[J]. arXiv preprint arXiv:1611.01576, 2016.
  20. [Memory Network] - Jason Weston, Sumit Chopra, Antoine Bordes, Memory Networks, arXiv:1410.3916
  21. [Pointer Network] - Oriol Vinyals, Meire Fortunato, and Navdeep Jaitly, Pointer Networks, arXiv:1506.03134 / NIPS 2015
  22. [Deep Attention Recurrent Q-Network] - Ivan Sorokin, Alexey Seleznev, Mikhail Pavlov, Aleksandr Fedorov, Anastasiia Ignateva, Deep Attention Recurrent Q-Network , arXiv:1512.01693
  23. [Dynamic Memory Networks] - Ankit Kumar, Ozan Irsoy, Peter Ondruska, Mohit Iyyer, James Bradbury, Ishaan Gulrajani, Victor Zhong, Romain Paulus, Richard Socher, "Ask Me Anything: Dynamic Memory Networks for Natural Language Processing", arXiv:1506.07285
  24. [SRU] - Lei T, Zhang Y. Training RNNs as Fast as CNNs[J]. arXiv preprint arXiv:1709.02755, 2017.
  25. [Zhihu] - How to evaluate the newly proposed RNN variant SRU?
  26. [attention] - Xu K, Ba J, Kiros R, et al. Show, attend and tell: Neural image caption generation with visual attention[C]//International Conference on Machine Learning. 2015: 2048-2057.
  27. [Persistent RNN] - Diamos G, Sengupta S, Catanzaro B, et al. Persistent RNNs: Stashing recurrent weights on-chip[C]//International Conference on Machine Learning. 2016: 2024-2033.
    .. [Persistent RNN] - Diamos G, Sengupta S, Catanzaro B, et al. Persistent RNNs: Stashing Weights on Chip[J]. 2016.
  28. [GitHub] - Awesome Recurrent Neural Networks.

