pemywei / attention-nmt

A simple implementation of attention based encoder-decoder for nmt.
https://github.com/pemywei/attention-nmt

What's the difference between this project and the one below? #1

Closed guotong1988 closed 7 years ago

guotong1988 commented 7 years ago

https://github.com/ilivans/tf-rnn-attention/blob/master/attention.py The linked project doesn't use prev_state as one of the input parameters. @pemywei Thank you!!!

pemywei commented 7 years ago

In the decoding phase, the attention mechanism computes alignment scores between the encoder outputs and the current target position, which requires the previous decoder state as one of the input parameters. Please refer to "Neural Machine Translation by Jointly Learning to Align and Translate" for details.
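
For reference, here is a minimal NumPy sketch of that additive (Bahdanau) attention step, showing where prev_state enters the score. All names and dimensions below are illustrative, not taken from this repo's code:

```python
import numpy as np

def bahdanau_attention(prev_state, encoder_outputs, W_s, W_h, v):
    """Additive (Bahdanau) attention: score each encoder output against the
    decoder's previous state, then build a context vector."""
    # scores[j] = v^T tanh(W_s s_{t-1} + W_h h_j)  -- prev_state is an input here
    scores = np.tanh(prev_state @ W_s + encoder_outputs @ W_h) @ v  # (src_len,)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()            # softmax over source positions
    context = weights @ encoder_outputs # weighted sum of encoder outputs
    return context, weights

# Toy sizes, purely for illustration
src_len, enc_dim, dec_dim, attn_dim = 5, 8, 8, 16
rng = np.random.default_rng(0)
H = rng.normal(size=(src_len, enc_dim))   # encoder outputs h_1..h_T
s_prev = rng.normal(size=(dec_dim,))      # previous decoder state s_{t-1}
W_s = rng.normal(size=(dec_dim, attn_dim))
W_h = rng.normal(size=(enc_dim, attn_dim))
v = rng.normal(size=(attn_dim,))

context, alignments = bahdanau_attention(s_prev, H, W_s, W_h, v)
print(alignments)  # one alignment weight per source position, sums to 1
```

By contrast, the pooling attention in ilivans/tf-rnn-attention scores the RNN outputs on their own (no decoder state), which is why it has no prev_state argument.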