harvardnlp / seq2seq-attn

Sequence-to-sequence model with LSTM encoder/decoders and attention
http://nlp.seas.harvard.edu/code
MIT License
1.26k stars · 278 forks

Visualize attention weights #58

Closed fbrad closed 7 years ago

fbrad commented 8 years ago

Hi! I was wondering if you guys plan on adding support for easy retrieval of the attention weights during decoding? This would be really helpful for qualitative analysis.

Thanks!

srush commented 8 years ago

Yes, we are working on this. We will try to get it in at least a branch soon-ish.

sebastianGehrmann commented 7 years ago

Just pushed a file to the attention_retrieval branch.
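(For readers landing here: seq2seq-attn itself is written in Lua/Torch, but once the per-step attention weights are retrieved and exported, they can be inspected with any tooling. Below is a minimal, illustrative Python sketch — all names and the export format are assumptions, not part of the repo — that recomputes softmax weights from raw alignment scores and renders a target-by-source matrix as a plain-text heatmap for qualitative analysis.)

```python
import math

def softmax(scores):
    # Numerically stable softmax: subtract the max before exponentiating.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def attention_heatmap(weights, src_tokens, tgt_tokens):
    """Render a target x source attention matrix as plain text.

    weights[i][j] is the weight the i-th target token places on the
    j-th source token; each row should sum to 1.
    (Illustrative helper; not part of seq2seq-attn.)
    """
    width = max(len(t) for t in src_tokens) + 2
    lines = [" " * 8 + "".join(t.ljust(width) for t in src_tokens)]
    for tgt, row in zip(tgt_tokens, weights):
        cells = "".join("{:.2f}".format(w).ljust(width) for w in row)
        lines.append(tgt.ljust(8) + cells)
    return "\n".join(lines)

# Toy example: raw alignment scores for 2 target steps over 3 source tokens.
scores = [[2.0, 0.5, 0.1], [0.2, 1.5, 2.5]]
weights = [softmax(row) for row in scores]
print(attention_heatmap(weights, ["le", "chat", "noir"], ["the", "cat"]))
```

Each row of the printed matrix sums to 1, and the largest value in a row shows which source token that decoding step attended to most.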

yoonkim commented 7 years ago

thanks sebastian! closing