harvardnlp / var-attn

Latent Alignment and Variational Attention
https://arxiv.org/abs/1807.03756
MIT License

Relaxed Alignments? #4

Open fermat97 opened 5 years ago

fermat97 commented 5 years ago

Is Algorithm 2 (relaxed alignment) included in this repository? I could only see examples for the categorical case. Thank you.

da03 commented 5 years ago

Nope, it's not included here. Relaxed alignment underperforms the soft attention baseline (see Table 1 in our paper), and since we need to pretrain with E_p \log p and then finetune to get sensible results, it seems hacky to me. However, if you are interested, it is straightforward to implement using PyTorch's built-in Dirichlet rsample function: https://pytorch.org/docs/stable/_modules/torch/distributions/dirichlet.html#Dirichlet.rsample.
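
A minimal sketch of what that could look like: draw a differentiable alignment from a Dirichlet whose concentration comes from the attention scores, then mix the encoder states with it. The `exp(scores) / temperature` parameterization and the function names below are illustrative assumptions, not the paper's exact setup.

```python
import torch
from torch.distributions import Dirichlet

def relaxed_attention(scores, values, temperature=1.0):
    """Relaxed-alignment sketch (assumed parameterization, not the paper's exact one).

    scores: (batch, src_len) unnormalized attention scores
    values: (batch, src_len, dim) encoder states
    """
    # Map scores to positive Dirichlet concentration parameters.
    concentration = scores.exp() / temperature
    # rsample gives a reparameterized (pathwise-differentiable) point on the simplex.
    alignment = Dirichlet(concentration).rsample()
    # Context vector: convex combination of values under the sampled alignment.
    context = torch.bmm(alignment.unsqueeze(1), values).squeeze(1)
    return context, alignment

scores = torch.randn(2, 5, requires_grad=True)
values = torch.randn(2, 5, 8)
context, alignment = relaxed_attention(scores, values)
# alignment rows sum to 1, and gradients flow back into `scores` through rsample.
```

Lowering `temperature` sharpens the sampled alignments toward the simplex corners, which is one way to anneal toward the categorical case.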