yaoxingcheng / TLM

ICML'2022: NLP From Scratch Without Large-Scale Pretraining: A Simple and Efficient Framework
MIT License

Without Large-Scale Pretraining? Thank you! #21

Closed: guotong1988 closed this issue 1 year ago

guotong1988 commented 1 year ago
[screenshot: the LM objective from the paper]

The LM objective is shown here (in the screenshot above).

What do you mean by "without large-scale pretraining"?

@yaoxingcheng Thank you very much.

guotong1988 commented 1 year ago

I see now: it is "without large-scale pretraining", not "without pretraining".
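
For context, the distinction seems to come down to how the paper's framework (TLM) is trained: the model starts from random initialization and jointly optimizes the supervised task loss together with a masked-LM loss computed only on the task data plus a small retrieved subset of the general corpus, instead of running a separate pretraining stage over a full large-scale corpus. Below is a minimal sketch of such a joint objective, assuming standard masked-LM label conventions; the function name and the weight `rho` are illustrative and are not taken from this repo's code.

```python
import torch.nn.functional as F

def tlm_joint_loss(task_logits, task_labels, mlm_logits, mlm_labels, rho: float = 20.0):
    """Hypothetical joint loss in the spirit of TLM.

    The supervised task loss is combined with a masked-LM loss that is
    computed on task data and a small retrieved corpus, with the model
    trained from scratch. `rho` weights the LM term and is illustrative,
    not the value used in this repo.
    """
    # Supervised task loss on labeled examples (e.g. classification).
    task_loss = F.cross_entropy(task_logits, task_labels)

    # Masked-LM loss; unmasked positions are labeled -100 and ignored,
    # following the usual masked-LM convention.
    mlm_loss = F.cross_entropy(
        mlm_logits.view(-1, mlm_logits.size(-1)),
        mlm_labels.view(-1),
        ignore_index=-100,
    )

    # An LM objective is still present, but there is no separate
    # large-scale pretraining stage before task training.
    return task_loss + rho * mlm_loss
```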