Morizeyao / GPT2-Chinese

Chinese version of GPT2 training code, using BERT tokenizer.
MIT License

How large is the parameter count of GPT2-Chinese? Is it the same as the original GPT-2? #288

Open ljqiang17 opened 1 year ago
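
One way to check this directly is to instantiate the model from the training config and count its parameters. The sketch below assumes a transformers-style `GPT2Config` stored at `config/model_config.json` (the path and config values are assumptions; adjust to whatever config the training run actually used). Note that the count will differ somewhat from the original GPT-2, since using the BERT tokenizer changes the vocabulary size and therefore the size of the embedding matrix.

```python
# Minimal sketch: count parameters of a GPT-2 model built from a JSON config.
# The config path below is an assumption; point it at the config file
# actually used for training.
from transformers import GPT2Config, GPT2LMHeadModel

config = GPT2Config.from_json_file("config/model_config.json")
model = GPT2LMHeadModel(config)

# Total trainable parameters, reported in millions.
n_params = sum(p.numel() for p in model.parameters() if p.requires_grad)
print(f"{n_params / 1e6:.1f}M parameters")
```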