richardbaihe/paperreading: NLP papers (MIT License)
arXiv 2019 | BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension
#9
Closed
richardbaihe closed this issue 4 years ago
richardbaihe commented 4 years ago
Facebook AI.
https://arxiv.org/pdf/1910.13461.pdf
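For reference, the core idea of the paper is corrupting input text and training a seq2seq model to reconstruct it; its most effective noising function is text infilling, where whole spans are replaced by a single mask token. Below is a minimal sketch of that noising step, not the authors' implementation: the function name, the gaussian approximation of the paper's Poisson(λ=3) span lengths, and the `mask_ratio` budget are all assumptions made for illustration.

```python
import random

def text_infilling(tokens, mask_ratio=0.3, mean_span=3, mask="<mask>", seed=0):
    """Sketch of BART-style text infilling: sample spans and replace
    each whole span with a single mask token, so the model must also
    predict how many tokens are missing.

    Span lengths here are drawn from a clipped gaussian as a stand-in
    for the paper's Poisson(lambda=3); this is an approximation.
    """
    rng = random.Random(seed)  # fixed seed for reproducibility
    budget = int(round(len(tokens) * mask_ratio))  # total tokens to corrupt
    out, i = [], 0
    while i < len(tokens):
        if budget > 0 and rng.random() < mask_ratio:
            # Replace a span of `span` tokens with ONE mask token.
            span = min(budget, max(1, int(rng.gauss(mean_span, 1))))
            out.append(mask)
            i += span
            budget -= span
        else:
            out.append(tokens[i])
            i += 1
    return out

corrupted = text_infilling("the quick brown fox jumps over the lazy dog".split())
```

The decoder is then trained to emit the original, uncorrupted sequence from this noisy input.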