keon / seq2seq

Minimal Seq2Seq model with Attention for Neural Machine Translation in PyTorch
MIT License
689 stars · 172 forks

what is the nn.Parameter v for? #7

Closed liuyijiang1994 closed 6 years ago

liuyijiang1994 commented 6 years ago

Hi, I noticed that your `Attention` module defines an `nn.Parameter` named `v`. I'm wondering what it is for. Thanks!

pskrunner14 commented 6 years ago

@liuyijiang1994 I think `v` is a learned parameter vector that projects each attention energy (the `tanh`-activated combination of the decoder hidden state and an encoder output) down to a single scalar score. Those scores are then softmaxed into the attention weights the decoder uses to focus on the most relevant source tokens.

You might want to give this paper a read; it explains attention for seq2seq encoder-decoder models quite well :)
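For reference, here is a minimal sketch of additive (Bahdanau-style) attention showing where `v` fits in. This is an illustrative reconstruction, not the repo's exact code; the module name, sizes, and tensor layout are assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Attention(nn.Module):
    """Additive attention sketch: `v` collapses each hidden-size
    energy vector into one scalar score per source position."""

    def __init__(self, hidden_size):
        super().__init__()
        self.attn = nn.Linear(hidden_size * 2, hidden_size)
        # The learned projection vector in question.
        self.v = nn.Parameter(torch.rand(hidden_size))

    def forward(self, hidden, encoder_outputs):
        # hidden: [batch, hidden]; encoder_outputs: [batch, src_len, hidden]
        src_len = encoder_outputs.size(1)
        h = hidden.unsqueeze(1).repeat(1, src_len, 1)
        # Energies: [batch, src_len, hidden]
        energy = torch.tanh(self.attn(torch.cat([h, encoder_outputs], dim=2)))
        # `v` turns each energy vector into a scalar: [batch, src_len]
        scores = energy @ self.v
        # Softmax over source positions -> attention weights
        return F.softmax(scores, dim=1)

attn = Attention(8)
weights = attn(torch.zeros(2, 8), torch.zeros(2, 5, 8))
print(weights.shape)       # torch.Size([2, 5])
print(weights.sum(dim=1))  # each row sums to 1
```

Without `v`, the energies would remain hidden-size vectors rather than the per-token scalar scores that softmax needs.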

liuyijiang1994 commented 6 years ago

much thanks!