Morizeyao / GPT2-Chinese

Chinese version of GPT2 training code, using BERT tokenizer.
MIT License

Same code trains fine on a Windows laptop, but shows "Killed" on a Linux desktop #195

Closed libralibra closed 3 years ago

libralibra commented 3 years ago

The code is exactly the same. When I copy it to Google Colab, it runs for a while without printing any loss, then a ^C appears and it stops. So I copied the code to a Linux desktop and ran it there; I could see two lines of step and loss output, and then it just printed Killed. What could be the cause?


JHR0717 commented 3 years ago

It's probably running out of memory.