emilmont opened this issue 5 years ago
Thanks for the proposal. Adding a pointer/copy mechanism is an important feature. cc @dmlc/gluon-nlp-committers
Hi @emilmont, thanks for the proposal. Are you referring to the att_weights
in the following? https://github.com/dmlc/gluon-nlp/blob/5bb1ce6c43627b93a829da7aa2bdacd7e28190a8/src/gluonnlp/model/attention_cell.py#L165
I think Emilio is referring to this:
Description
When implementing a pointer mechanism in sequence-to-sequence models, it is very practical to be able to re-use the existing attention cells. For example, see the attention-based copy mechanism described in Jia, Robin, and Percy Liang. "Data recombination for neural semantic parsing." arXiv preprint arXiv:1606.03622 (2016).
The proposal is to additionally return the raw (pre-softmax) attention scores from the AttentionCell, alongside the context vector and normalized weights.
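As a rough illustration of the proposed interface (this is a minimal, self-contained sketch, not the actual gluon-nlp AttentionCell; the class name RawScoreDotProductAttention is hypothetical), a dot-product cell could expose the raw scores as a third output:

```python
import mxnet as mx
from mxnet.gluon import HybridBlock


class RawScoreDotProductAttention(HybridBlock):
    """Minimal dot-product attention cell that also exposes the raw
    (pre-softmax) attention scores, e.g. to drive a copy/pointer
    distribution over the source tokens."""

    def hybrid_forward(self, F, query, key, value):
        # Raw scores: shape (batch_size, query_length, key_length)
        att_score = F.batch_dot(query, key, transpose_b=True)
        # Normalized weights used to read from the values
        att_weights = F.softmax(att_score, axis=-1)
        # Context vector: shape (batch_size, query_length, value_dim)
        context_vec = F.batch_dot(att_weights, value)
        # Returning att_score as well is the addition this issue proposes
        return context_vec, att_weights, att_score
```

Usage would then look like this, with the raw scores available to build the copy distribution:

```python
cell = RawScoreDotProductAttention()
cell.initialize()
query = mx.nd.random.normal(shape=(2, 3, 8))   # (batch, query_len, dim)
key = mx.nd.random.normal(shape=(2, 5, 8))     # (batch, key_len, dim)
value = mx.nd.random.normal(shape=(2, 5, 8))
context_vec, att_weights, att_score = cell(query, key, value)
```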
References
Related literature:
Jia, Robin, and Percy Liang. "Data recombination for neural semantic parsing." arXiv preprint arXiv:1606.03622 (2016).