Open leejason opened 4 years ago
Ask the 1.5B model; I heard it knows for sure.
Will the code for training from scratch be released after the 1.5B model?
Here's a way to train or finetune: https://github.com/nshepperd/gpt-2
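While we wait for official training code, here is a toy sketch of what "training from scratch" means at the smallest possible scale: a character-level bigram language model fitted with plain gradient descent in NumPy. This is not OpenAI's or nshepperd's code, just an illustrative assumption-free minimum; real GPT-2 training swaps the bigram table for a Transformer and this loop for a distributed optimizer.

```python
import numpy as np

# Toy "train from scratch" sketch: a bigram language model where
# logits for the next character are a learned row W[current_char].
rng = np.random.default_rng(0)
text = "hello world, hello gpt"
chars = sorted(set(text))
stoi = {c: i for i, c in enumerate(chars)}
V = len(chars)

# Training pairs: each character predicts the next one.
xs = np.array([stoi[c] for c in text[:-1]])
ys = np.array([stoi[c] for c in text[1:]])

W = rng.normal(0, 0.1, size=(V, V))  # randomly initialized, i.e. "from scratch"

def loss_and_probs(W):
    """Mean cross-entropy of the bigram model plus softmax probabilities."""
    logits = W[xs]                               # (N, V)
    logits = logits - logits.max(axis=1, keepdims=True)  # numerical stability
    probs = np.exp(logits)
    probs /= probs.sum(axis=1, keepdims=True)
    nll = -np.log(probs[np.arange(len(ys)), ys]).mean()
    return nll, probs

initial_loss, _ = loss_and_probs(W)
for _ in range(300):                             # plain full-batch gradient descent
    _, probs = loss_and_probs(W)
    grad_logits = probs
    grad_logits[np.arange(len(ys)), ys] -= 1.0   # d(cross-entropy)/d(logits)
    grad_logits /= len(ys)
    gW = np.zeros_like(W)
    np.add.at(gW, xs, grad_logits)               # scatter gradients back into W rows
    W -= 1.0 * gW
final_loss, _ = loss_and_probs(W)
# After training, the loss should be well below its random-init value.
```

The same loop shape (forward pass, loss, backward pass, parameter update) is what any released GPT-2 training script would implement, just at vastly larger scale.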
I found more info in an old, already-closed issue: https://github.com/openai/gpt-2/issues/19. I believe that link will be helpful for newcomers like myself. Ideally, this repository would have a wiki.