facebookresearch / fairseq

Facebook AI Research Sequence-to-Sequence Toolkit written in Python.

positional attention in nonautoregressive_transformer #1380

Closed: JackHorse closed this issue 5 years ago

JackHorse commented 5 years ago

Hi,

Thank you for releasing the code for nonautoregressive_transformer.

But why can't I find the positional attention in the decoder, as described in the paper Non-Autoregressive Neural Machine Translation (Gu et al., 2017)?

MultiPath commented 5 years ago

Hi, we only implemented the simplest version of the nonautoregressive model, which does not use any of the special modules described in the original paper. I can push a new PR later to implement the positional attention and fertility.

JackHorse commented 5 years ago

All right!

Thanks for your prompt reply.

SkyAndCloud commented 4 years ago

@MultiPath Hi, I would like to know when you will submit the PR.

MultiPath commented 4 years ago

@SkyAndCloud Hi, I currently don't have the bandwidth to re-implement this part. Maybe in January.

wangwang110 commented 4 years ago

> @SkyAndCloud Hi, I currently don't have the bandwidth to re-implement this part. Maybe in January.

Any new developments?

wangwang110 commented 4 years ago

@MultiPath Any new developments on re-implementing the NAT modules?

SkyAndCloud commented 4 years ago

@wangwang110 Hi, I haven't seen any update since then.