shibing624 / MedicalGPT

MedicalGPT: Training Your Own Medical GPT Model with ChatGPT Training Pipeline. Trains medical large language models, implementing continual pre-training (PT), supervised fine-tuning (SFT), RLHF, DPO, and ORPO.
Apache License 2.0

Works fine with bloom, but this exception is thrown after switching to ChatGLM2 #307

Closed: yuwangnexusera closed this issue 5 months ago

yuwangnexusera commented 5 months ago

Traceback (most recent call last):
  File "/content/MedicalGPT/supervised_finetuning.py", line 1394, in <module>
    main()
  File "/content/MedicalGPT/supervised_finetuning.py", line 1315, in main
    model.gradient_checkpointing_enable()
  File "/usr/local/lib/python3.10/dist-packages/transformers/modeling_utils.py", line 1872, in gradient_checkpointing_enable
    self._set_gradient_checkpointing(enable=True, gradient_checkpointing_func=gradient_checkpointing_func)
TypeError: ChatGLMPreTrainedModel._set_gradient_checkpointing() got an unexpected keyword argument 'enable'

shibing624 commented 5 months ago

The GLM model does not support model.gradient_checkpointing_enable(). You can comment out that line of code, or pull the latest code.
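For reference, a minimal sketch of how the call could be guarded instead of removed outright, so that models whose remote code rejects the newer keyword arguments (as ChatGLM2's does here) are skipped with a warning rather than crashing the run. The helper name `enable_gradient_checkpointing_safely` and the logging setup are illustrative assumptions, not code from supervised_finetuning.py:

```python
import logging

logger = logging.getLogger(__name__)

def enable_gradient_checkpointing_safely(model):
    """Try to turn on gradient checkpointing; skip models whose custom
    code does not accept the keyword arguments newer transformers pass."""
    if not getattr(model, "supports_gradient_checkpointing", False):
        logger.warning("Model does not support gradient checkpointing; skipping.")
        return
    try:
        model.gradient_checkpointing_enable()
    except TypeError as e:
        # ChatGLM2's custom _set_gradient_checkpointing() rejects the
        # 'enable' kwarg that transformers >= 4.35 passes through.
        logger.warning("Gradient checkpointing unavailable for this model: %s", e)
```

This keeps gradient checkpointing enabled for models such as bloom while letting ChatGLM2 train without it, at the cost of higher activation memory for that model.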