graykode / nlp-tutorial

Natural Language Processing Tutorial for Deep Learning Researchers

A question about seq2seq with attention #28

Closed. wking-tao closed this issue 5 years ago.

wking-tao commented 5 years ago

Hi, I have a question about how the attention weights are calculated. https://github.com/graykode/nlp-tutorial/blob/master/4-2.Seq2Seq(Attention)/Seq2Seq(Attention)-Torch.py

[Screenshot of the code in question, 2019-05-27 5:31 PM]

In line 60, attn_weights is computed from dec_output and enc_outputs in your code. Why not from dec_hidden and enc_hidden?
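
For context, here is a minimal sketch of the step the question refers to: scoring the decoder output at the current time step against every encoder output, normalizing the scores into attn_weights, and using them to form a context vector. The shapes and the plain dot-product score below are assumptions for illustration; the tutorial's actual code routes the score through a small learned Attention module rather than a raw dot product.

```python
import torch
import torch.nn.functional as F

# Hypothetical shapes for illustration (not taken from the tutorial):
# enc_outputs: [seq_len, batch=1, hidden]  -- encoder output at every source step
# dec_output:  [1, batch=1, hidden]        -- decoder RNN output at the current step
seq_len, hidden = 5, 128
enc_outputs = torch.randn(seq_len, 1, hidden)
dec_output = torch.randn(1, 1, hidden)

# Dot-product score between the current decoder output and each encoder output.
scores = torch.stack([torch.dot(dec_output.squeeze(), enc_outputs[t].squeeze())
                      for t in range(seq_len)])              # [seq_len]
attn_weights = F.softmax(scores, dim=0).view(1, 1, seq_len)  # [1, 1, seq_len]

# Context vector: attention-weighted sum of the encoder outputs.
context = attn_weights.bmm(enc_outputs.transpose(0, 1))      # [1, 1, hidden]
```

Note that for a single-layer nn.RNN (or GRU), the output at step t is the top-layer hidden state at that step, so scoring with dec_output is effectively the same as scoring with the decoder hidden state at that step.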