huggingface / transformers

🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
https://huggingface.co/transformers
Apache License 2.0

Add non-autoregressive models? #13488

Open nlpcat opened 2 years ago

nlpcat commented 2 years ago

This research area has developed very fast over the past two years; non-autoregressive models can improve decoding latency significantly while keeping quality on par with autoregressive baselines. Would the Hugging Face team be interested in implementing one of them? Thanks.
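For context, the latency gain comes from replacing the token-by-token decoding loop with a single parallel pass over all target positions. A minimal PyTorch sketch of the contrast (`model`, `length_predictor`, and the special token ids are hypothetical placeholders, not an existing API):

```python
import torch

def autoregressive_decode(model, src, max_len, bos_id, eos_id):
    # One forward pass per generated token: latency grows with output length.
    tokens = torch.full((src.size(0), 1), bos_id, dtype=torch.long)
    for _ in range(max_len):
        logits = model(src, tokens)                  # (batch, t, vocab)
        next_tok = logits[:, -1].argmax(-1, keepdim=True)
        tokens = torch.cat([tokens, next_tok], dim=1)
        if (next_tok == eos_id).all():
            break
    return tokens

def non_autoregressive_decode(model, src, length_predictor, mask_id):
    # A single forward pass predicts every target position in parallel,
    # which is where the latency win comes from.
    tgt_len = length_predictor(src)                  # one length, for simplicity
    dec_in = torch.full((src.size(0), tgt_len), mask_id, dtype=torch.long)
    logits = model(src, dec_in)                      # (batch, tgt_len, vocab)
    return logits.argmax(-1)
```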

NielsRogge commented 2 years ago

Can you point to specific papers, perhaps with corresponding code + weights?

nlpcat commented 2 years ago

This paper (GLAT) reports quality on par with the autoregressive baseline at about a 10x decoding speed-up: https://github.com/FLC777/GLAT
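For reference, the core of GLAT is "glancing sampling" during training: decode once from a fully masked input, reveal a number of gold tokens proportional to how many first-pass predictions were wrong, and train the second pass on the positions that stayed masked. A rough PyTorch sketch under that reading of the paper (`decoder` and `src_enc` are hypothetical, and the random position selection is a simplification):

```python
import torch
import torch.nn.functional as F

def glancing_training_step(decoder, src_enc, tgt, mask_id, ratio=0.5):
    # First pass: predict all target positions from a fully masked input.
    dec_in = torch.full_like(tgt, mask_id)
    pred = decoder(src_enc, dec_in).argmax(-1)

    # Glance: reveal more gold tokens when the first pass was worse.
    n_wrong = (pred != tgt).sum(dim=1, keepdim=True).float()
    n_reveal = (ratio * n_wrong).long()
    scores = torch.rand_like(tgt, dtype=torch.float)
    ranks = scores.argsort(dim=1).argsort(dim=1)     # random per-row ranks
    reveal = ranks < n_reveal                        # k random positions per row
    glanced_in = torch.where(reveal, tgt, dec_in)

    # Second pass: loss only on the positions that stayed masked.
    logits = decoder(src_enc, glanced_in)
    return F.cross_entropy(logits[~reveal], tgt[~reveal])
```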

Another similar NAT paper: https://github.com/tencent-ailab/ICML21_OAXE
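OAXE (order-agnostic cross entropy) relaxes the word-order penalty in the NAT loss: predictions are scored against the best possible assignment of gold tokens to decoder positions, which reduces to a bipartite matching problem. A sketch of that idea for a single sentence, using SciPy's Hungarian solver (shapes and function name are mine, not the paper's code):

```python
import torch
import torch.nn.functional as F
from scipy.optimize import linear_sum_assignment

def oaxe_loss(logits, tgt):
    # logits: (T, vocab) parallel decoder outputs; tgt: (T,) gold token ids.
    log_probs = F.log_softmax(logits, dim=-1)
    # cost[i, j] = -log P(gold token j at decoder position i)
    cost = -log_probs[:, tgt]
    row, col = linear_sum_assignment(cost.detach().cpu().numpy())
    # Cross entropy under the lowest-cost, order-agnostic assignment.
    return cost[torch.as_tensor(row), torch.as_tensor(col)].mean()
```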

Most NAT models are implemented on top of fairseq, e.g.:

https://github.com/pytorch/fairseq/blob/master/fairseq/models/nat/nonautoregressive_transformer.py
https://github.com/pytorch/fairseq/blob/master/fairseq/models/nat/levenshtein_transformer.py
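As a concrete example of what these models need beyond a standard encoder-decoder: since the vanilla NAT decoder fills in all positions at once, the target length must be predicted up front, and fairseq frames this as classifying an offset from the source length. A hedged sketch of such a length head (the class name and mean-pooling choice are my own, not fairseq's API):

```python
import torch
import torch.nn as nn

class LengthPredictor(nn.Module):
    """Classify the target length as an offset from the source length."""

    def __init__(self, d_model, max_offset=128):
        super().__init__()
        self.max_offset = max_offset
        self.proj = nn.Linear(d_model, 2 * max_offset + 1)

    def forward(self, enc_out, src_len):
        # enc_out: (batch, src_len, d_model); src_len: (batch,) source lengths.
        pooled = enc_out.mean(dim=1)                 # simple mean pooling
        offset = self.proj(pooled).argmax(-1) - self.max_offset
        return (src_len + offset).clamp(min=1)
```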