http://cnyah.com/2017/08/01/attention-variants/
Attention Variants | CN Yah
By Jingxi Liang 2017-01-08 Updated: 2017-02-08. The attention mechanism has proved itself to be a necessary component of RNNs for tasks like NMT, MC, QA, and NLI. It might be useful to compare some popular attention variants in the NLP field.
cnyah.com
What is the difference between Luong Attention and Bahdanau Attention?
These two attention mechanisms are used in seq2seq models. They are introduced as multiplicative and additive attention in the TensorFlow documentation. What is the difference?
stackoverflow.com
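The distinction these sources discuss comes down to the scoring function: Bahdanau (additive) attention passes the decoder and encoder states through a learned feed-forward layer with a tanh, while Luong (multiplicative) attention scores them with a (possibly weighted) dot product. Below is a minimal NumPy sketch contrasting the two score functions; the function names, weight matrices, and shapes are illustrative assumptions, not taken from any of the linked posts.

```python
import numpy as np

def additive_score(s_prev, h, W_s, W_h, v):
    """Bahdanau-style (additive) score: v^T tanh(W_s s_{t-1} + W_h h_i) for each encoder state h_i."""
    # s_prev: (d_dec,) previous decoder state; h: (n_enc, d_enc) encoder states
    return np.tanh(s_prev @ W_s.T + h @ W_h.T) @ v      # (n_enc,)

def multiplicative_score(s_t, h, W_a):
    """Luong-style (multiplicative, "general") score: h_i^T W_a s_t for each encoder state h_i."""
    return h @ W_a @ s_t                                 # (n_enc,)

def attention(scores, h):
    """Softmax over the scores, then a weighted sum of encoder states (the context vector)."""
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    return weights @ h                                    # (d_enc,)

# Toy shapes for illustration only
rng = np.random.default_rng(0)
d_enc, d_dec, d_att, n_enc = 4, 3, 5, 6
h = rng.normal(size=(n_enc, d_enc))
s = rng.normal(size=(d_dec,))
W_s = rng.normal(size=(d_att, d_dec))
W_h = rng.normal(size=(d_att, d_enc))
v = rng.normal(size=(d_att,))
W_a = rng.normal(size=(d_enc, d_dec))

ctx_additive = attention(additive_score(s, h, W_s, W_h, v), h)
ctx_multiplicative = attention(multiplicative_score(s, h, W_a), h)
```

A further difference the sketch abstracts away: Bahdanau scores against the previous decoder state s_{t-1} and feeds the context into computing the next state, whereas Luong scores against the current decoder output; the snippet only contrasts the score functions themselves.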
https://hcnoh.github.io/2018-12-11-bahdanau-attention
[Attention] Bahdanau Attention Concept Summary
This post studies and summarizes the following paper: Link 1
hcnoh.github.io
https://pravn.wordpress.com/2017/11/14/bahdanau-attention/
Bahdanau attention
In an earlier post, I wrote about seq2seq without attention by way of introducing the idea. This time, we extend that setup by adding attention. In the regular seq2seq model, we …
pravn.wordpress.com
https://towardsdatascience.com/attention-based-neural-machine-translation-b5d129742e2c