AkiraTOSEI / ML_papers

ML_paper_summary(in Japanese)

Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer #97

Open AkiraTOSEI opened 3 years ago

TL;DR

The authors propose T5, a unified text-to-text model that casts classification, translation, question answering, and summarization into a single format, along with C4, a large-scale English pre-training dataset. They comprehensively investigate pre-training strategies (unsupervised objectives, mask/corruption formats, etc.) and achieve SOTA on a variety of tasks.
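One of the pre-training design choices the paper studies is span corruption: contiguous spans of the input are replaced with sentinel tokens, and the target sequence lists each sentinel followed by the tokens it replaced. A minimal sketch of that idea follows; `span_corrupt` and its parameters are illustrative names for this summary, not code from the paper.

```python
import random

def span_corrupt(tokens, corruption_rate=0.15, mean_span_len=3, seed=0):
    """Sketch of T5-style span corruption.

    Masks roughly `corruption_rate` of the tokens in contiguous spans.
    Each masked span becomes one sentinel token (<extra_id_N>) in the
    input; the target is the sentinels followed by the spans they hide.
    """
    rng = random.Random(seed)
    n_to_mask = max(1, round(len(tokens) * corruption_rate))
    masked = set()
    while len(masked) < n_to_mask:
        start = rng.randrange(len(tokens))
        for i in range(start, min(start + mean_span_len, len(tokens))):
            masked.add(i)

    inputs, targets = [], []
    sentinel = 0
    i = 0
    while i < len(tokens):
        if i in masked:
            tok = f"<extra_id_{sentinel}>"
            inputs.append(tok)
            targets.append(tok)
            # Consume the whole contiguous masked span under one sentinel.
            while i < len(tokens) and i in masked:
                targets.append(tokens[i])
                i += 1
            sentinel += 1
        else:
            inputs.append(tokens[i])
            i += 1
    return inputs, targets
```

Because the format is text-to-text, the same model also consumes supervised tasks as plain strings with a task prefix (e.g. "translate English to German: ..." as input, the German sentence as target), so no task-specific heads are needed.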

Why it matters:

Paper URL

https://arxiv.org/abs/1910.10683

Submission Dates (yyyy/mm/dd)

2019/10/23

Authors and institutions

Colin Raffel, Noam Shazeer, Adam Roberts, Katherine Lee, Sharan Narang, Michael Matena, Yanqi Zhou, Wei Li, Peter J. Liu (Google)

Methods

Results

Comments