dmlc / gluon-nlp

NLP made easy
https://nlp.gluon.ai/
Apache License 2.0

Add raw attention scores to the AttentionCell #951


emilmont commented 5 years ago

Description

When implementing a pointer mechanism in sequence-to-sequence models, it is very practical to reuse attention cells; see, for example, the attention-based copy mechanism described in Jia and Liang (2016), listed under References below.

The proposal is to additionally return the raw attention scores (i.e., the scores before softmax normalization) from the AttentionCell:
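A minimal sketch of what this could look like, assuming a plain dot-product cell; the class name and three-output signature here are illustrative only, not gluon-nlp's actual API:

```python
# Illustrative sketch (hypothetical, not gluon-nlp's actual API): a
# dot-product attention cell that returns the raw pre-softmax scores
# alongside the normalized weights, so a pointer/copy mechanism can
# reuse them.
import mxnet as mx
from mxnet.gluon import HybridBlock


class DotProductAttentionCell(HybridBlock):
    def hybrid_forward(self, F, query, key, value):
        # Raw similarity scores: shape (batch_size, query_len, key_len)
        raw_scores = F.batch_dot(query, key, transpose_b=True)
        # Normalized attention weights (what the cell exposes today)
        att_weights = F.softmax(raw_scores, axis=-1)
        # Weighted read of the values: (batch_size, query_len, value_dim)
        context_vec = F.batch_dot(att_weights, value)
        # Returning raw_scores is the proposed addition: a copy mechanism
        # can fold these unnormalized scores into its own softmax over
        # source tokens instead of re-normalizing att_weights.
        return context_vec, att_weights, raw_scores
```

Exposing the scores before the softmax matters because a copy mechanism typically normalizes jointly over the generate and copy actions, which cannot be recovered from the already-normalized att_weights.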

References

Related literature:

- Jia, Robin, and Percy Liang. "Data recombination for neural semantic parsing." arXiv preprint arXiv:1606.03622 (2016).

szha commented 5 years ago

Thanks for the proposal. Adding a pointer/copy mechanism is an important feature. cc @dmlc/gluon-nlp-committers

szhengac commented 5 years ago

Hi @emilmont, thanks for the proposal. Are you referring to the att_weights in the following? https://github.com/dmlc/gluon-nlp/blob/5bb1ce6c43627b93a829da7aa2bdacd7e28190a8/src/gluonnlp/model/attention_cell.py#L165
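For readers without the source open, the forward pass at that commit looks roughly like this (a paraphrase, not the exact source); the softmax is applied inside _compute_weight, so callers only ever see the normalized weights:

```python
# Paraphrased control flow of AttentionCell.hybrid_forward (not the
# exact source): raw scores are consumed inside _compute_weight and
# never surface to the caller.
def hybrid_forward(self, F, query, key, value=None, mask=None):
    att_weights = self._compute_weight(F, query, key, mask)    # softmax already applied
    context_vec = self._read_by_weight(F, att_weights, value)  # weighted sum over values
    return context_vec, att_weights
```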

szha commented 5 years ago

I think Emilio is referring to this:

https://github.com/dmlc/gluon-nlp/blob/5bb1ce6c43627b93a829da7aa2bdacd7e28190a8/src/gluonnlp/model/attention_cell.py#L57-L58