openai / gpt-2

Code for the paper "Language Models are Unsupervised Multitask Learners"
https://openai.com/blog/better-language-models/

The EXTRA LARGE model could only be downloaded with this parameter: **1558M** #298

Open Quirinus-Kira opened 3 years ago

Quirinus-Kira commented 3 years ago

The EXTRA LARGE model could only be downloaded with the parameter 1558M:

$ python download_model.py 1558M

The same model is listed as EXTRA LARGE 1542M in https://github.com/openai/gpt-2-output-dataset and as 1.5B in https://openai.com/blog/gpt-2-1-5b-release/

Originally posted by @maigva in https://github.com/openai/gpt-2/issues/209#issuecomment-899476044
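The confusion comes from the same model being named three different ways (1.5B on the blog, 1542M in the output-dataset repo, 1558M in this repo's download script). A small hypothetical helper sketching the alias mapping — the alias table and `download_arg` function are illustrative, not part of the repo; only `1558M` is confirmed as the argument `download_model.py` accepts:

```python
# Hypothetical alias table mapping the names used across OpenAI's pages
# to the model-name argument that download_model.py actually accepts.
ALIASES = {
    "1.5B": "1558M",   # name used in the gpt-2-1-5b-release blog post
    "1542M": "1558M",  # name used in the gpt-2-output-dataset repo
    "1558M": "1558M",  # the argument download_model.py expects
}

def download_arg(name: str) -> str:
    """Return the argument to pass to download_model.py for a given alias."""
    try:
        return ALIASES[name]
    except KeyError:
        raise ValueError(f"unknown model name: {name!r}")

print(download_arg("1.5B"))  # -> 1558M
```

So regardless of which name you saw, the command is `python download_model.py 1558M`.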

preccrep commented 1 year ago

Really helped me! I was just wondering how to download the 1.5B version.