AkiraTOSEI / ML_papers

ML_paper_summary(in Japanese)

Big Self-Supervised Models are Strong Semi-Supervised Learners #88

Open AkiraTOSEI opened 3 years ago


TL;DR

They propose SimCLRv2, which uses only a small fraction of labels yet performs as well as or better than fully supervised learning. It consists of three stages: unsupervised pretraining, fine-tuning on the labeled data, and self-training distillation using unlabeled data. In general, larger models perform better.
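The third stage trains a (possibly smaller) student network to match the fine-tuned teacher's temperature-scaled predictions on unlabeled data. A minimal NumPy sketch of such a distillation loss, assuming a standard softened cross-entropy formulation; the function names and the temperature value are illustrative, not taken from the paper:

```python
import numpy as np

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax, shifted for numerical stability.
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    # Cross-entropy between the teacher's softened distribution and the
    # student's softened distribution, averaged over the batch.
    p_teacher = softmax(teacher_logits, temperature)
    log_p_student = np.log(softmax(student_logits, temperature))
    return -(p_teacher * log_p_student).sum(axis=-1).mean()

# Toy check: a student that agrees with the teacher incurs a lower loss
# than one that puts its mass on a different class.
teacher = np.array([[5.0, 1.0, 0.5]])
good_student = np.array([[4.8, 1.1, 0.4]])
bad_student = np.array([[0.5, 5.0, 1.0]])
print(distillation_loss(teacher, good_student) < distillation_loss(teacher, bad_student))
```

Note that no ground-truth labels appear in the loss, which is what lets this stage consume unlabeled data.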

Why it matters:

Paper URL

https://arxiv.org/abs/2006.10029

Submission Dates (yyyy/mm/dd)

Authors and institutions

Methods

Results

Comments