openai / gpt-2

Code for the paper "Language Models are Unsupervised Multitask Learners"
https://openai.com/blog/better-language-models/

training from scratch #201

Open leejason opened 5 years ago

leejason commented 5 years ago

Will the code for training from scratch be released after the 1.5B model?

senorblasto commented 5 years ago

Ask the 1.5B model; I heard it knows for sure.

zjm008 commented 5 years ago

> Will the code for training from scratch be released after the 1.5B model?

Here's a fork that supports both training and fine-tuning: https://github.com/nshepperd/gpt-2
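For anyone landing here later: below is a minimal sketch of both options (fine-tuning the released checkpoint vs. training the same architecture from randomly initialized weights) using the Hugging Face `transformers` and `datasets` libraries, which wrap the released GPT-2 weights. This is not this repo's TensorFlow code, and the file path and hyperparameters are illustrative placeholders; the nshepperd fork above stays closer to the original codebase.

```python
# Minimal sketch of GPT-2 fine-tuning vs. training from scratch using
# Hugging Face `transformers` (not the TensorFlow code in this repo).
# "corpus.txt" and all hyperparameters below are placeholders.
from transformers import (
    GPT2Config,
    GPT2LMHeadModel,
    GPT2TokenizerFast,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)
from datasets import load_dataset

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 defines no pad token

# Option A: fine-tune the released 124M checkpoint.
model = GPT2LMHeadModel.from_pretrained("gpt2")

# Option B: train from scratch -- same architecture, random weights.
# config = GPT2Config()            # defaults match the 124M model
# model = GPT2LMHeadModel(config)

# Load your own plain-text corpus and tokenize it.
dataset = load_dataset("text", data_files={"train": "corpus.txt"})["train"]

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

dataset = dataset.map(tokenize, batched=True, remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="gpt2-run",
        num_train_epochs=1,
        per_device_train_batch_size=2,
    ),
    train_dataset=dataset,
    # mlm=False gives the causal language-modeling objective GPT-2 uses.
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```

Training from scratch (Option B) needs far more data and compute than fine-tuning to produce coherent text, which is why most people start from the released checkpoint.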

mikolasan commented 4 years ago

I found more information in an old, already-closed issue: https://github.com/openai/gpt-2/issues/19. I believe that link will be helpful for newcomers like myself. Ideally, this repository would have a wiki.