ssbuild / chatglm_finetuning

chatglm 6b finetuning and alpaca finetuning

Without LoRA, with 28 layers loaded and 28 layers frozen, training hangs after 1 epoch #195

Closed leoluopy closed 1 year ago

leoluopy commented 1 year ago

Without LoRA, I load 28 layers and freeze 28 layers, and training hangs after finishing 1 epoch. Config: global_num_layers_freeze = 28, 'with_lora': False,

At training startup the parameter summary shows: 534 M Trainable params, 5.6 B Non-trainable params, 6.2 B Total params
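A minimal sketch of what this summary reflects, using a toy layer stack in place of the real 6.2 B parameter model (the names `num_layers` and `global_num_layers_freeze` mirror the issue's config but the model here is a placeholder; in the real run the 534 M trainable params presumably come from modules outside the frozen transformer layers, such as embeddings):

```python
import torch.nn as nn

# Toy stand-in for the 28-layer transformer stack.
num_layers = 28
global_num_layers_freeze = 28  # from the issue's config
model = nn.Sequential(*[nn.Linear(64, 64) for _ in range(num_layers)])

# Freeze the first `global_num_layers_freeze` layers, as the config requests.
for layer in list(model)[:global_num_layers_freeze]:
    for p in layer.parameters():
        p.requires_grad = False

# Count parameters the same way the startup summary does.
trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
frozen = sum(p.numel() for p in model.parameters() if not p.requires_grad)
print(trainable, frozen)  # all 28 toy layers frozen -> trainable == 0 here
```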

After finishing 1 epoch, training hangs with no response; even after waiting a long time the next epoch does not start.

leoluopy commented 1 year ago

It was the time spent writing files; after switching to a local disk, the problem was alleviated.
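In other words, the epoch-end hang was the checkpoint write blocking on slow storage. A hypothetical sketch of the fix, pointing the checkpoint path at local disk (paths and names here are placeholders, not the repo's actual config keys):

```python
import os
import tempfile

import torch
import torch.nn as nn

# Stand-in model; in the real run this is the 6 B ChatGLM.
model = nn.Linear(8, 8)

# Write the epoch-end checkpoint to a local directory instead of a
# network mount; a slow mount can make the save block for a long time,
# which looks like a hang between epochs.
local_dir = tempfile.mkdtemp()  # placeholder for e.g. a local SSD path
ckpt_path = os.path.join(local_dir, "epoch_0.pt")
torch.save(model.state_dict(), ckpt_path)

# Reload to verify the checkpoint round-trips.
state = torch.load(ckpt_path)
print(sorted(state.keys()))  # ['bias', 'weight']
```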