Closed: wking-tao closed this issue 5 years ago
Hi, I have a question about how the attention weights are calculated in https://github.com/graykode/nlp-tutorial/blob/master/4-2.Seq2Seq(Attention)/Seq2Seq(Attention)-Torch.py

At line 60, `attn_weights` is computed from `dec_output` and `enc_outputs` in your code. Why not from `dec_hidden` and `enc_hidden`?
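For context, my current understanding (which may be the answer to my own question) is that for a single-layer, unidirectional `torch.nn.RNN`, the `output` tensor already contains the hidden state at every time step, so `enc_outputs` and `dec_output` are themselves hidden states. A quick sketch checking this assumption:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Single-layer, unidirectional RNN, as in the tutorial's encoder/decoder.
rnn = nn.RNN(input_size=4, hidden_size=8)

x = torch.randn(5, 1, 4)  # (seq_len, batch, input_size)
output, hidden = rnn(x)

# output[t] is the hidden state at step t; the last output equals
# the final hidden state returned separately.
assert torch.allclose(output[-1], hidden[0])
print(output.shape)  # per-step hidden states: (5, 1, 8)
```

If this holds, then scoring `dec_output` against `enc_outputs` is effectively scoring decoder hidden states against encoder hidden states. Is that the intended reasoning?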