THUDM / ChatGLM-6B

ChatGLM-6B: An Open Bilingual Dialogue Language Model
Apache License 2.0

[BUG/Help] AttributeError: 'Seq2SeqTrainer' object has no attribute 'is_deepspeed_enabled' when running train.sh #1263

Open · Mixide opened this issue 1 year ago

Mixide commented 1 year ago

Is there an existing issue for this?

Current Behavior

```
6/20/2023 19:56:11 - WARNING - transformers_modules.chatglm-6b.modeling_chatglm - use_cache=True is incompatible with gradient checkpointing. Setting use_cache=False...
  0%|▎         | 10/3000 [01:12<5:41:49, 6.86s/it]
Traceback (most recent call last):
  File "F:\Code\python\DeepLearn\Huggingface\ChatGLM-6B\ptuning\main.py", line 431, in <module>
    main()
  File "F:\Code\python\DeepLearn\Huggingface\ChatGLM-6B\ptuning\main.py", line 370, in main
    train_result = trainer.train(resume_from_checkpoint=checkpoint)
  File "F:\Code\python\DeepLearn\Huggingface\ChatGLM-6B\ptuning\trainer.py", line 1635, in train
    return inner_training_loop(
  File "F:\Code\python\DeepLearn\Huggingface\ChatGLM-6B\ptuning\trainer.py", line 1981, in _inner_training_loop
    self._maybe_log_save_evaluate(tr_loss, model, trial, epoch, ignore_keys_for_eval)
  File "F:\Code\python\DeepLearn\Huggingface\ChatGLM-6B\ptuning\trainer.py", line 2220, in _maybe_log_save_evaluate
    logs["learning_rate"] = self._get_learning_rate()
  File "E:\Anaconda3\envs\py310\lib\site-packages\transformers\trainer_pt_utils.py", line 841, in _get_learning_rate
    if self.is_deepspeed_enabled:
AttributeError: 'Seq2SeqTrainer' object has no attribute 'is_deepspeed_enabled'
```

Expected Behavior

No response

Steps To Reproduce

```
conda activate py310
bash train.sh
```

Environment

- OS: Windows 11
- Python: 3.10
- Transformers: 4.30.2
- PyTorch: 2.0.0
- CUDA Support: True

Anything else?

No response

suuuch commented 1 year ago

Downgrading transformers to 4.27 fixes this problem: `pip install transformers==4.27.1`
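The mismatch is visible in the traceback: the training loop runs through the `trainer.py` copied into `ptuning/` (taken from an older transformers release), while `_get_learning_rate` comes from the installed transformers 4.30.2 in site-packages, which reads an `is_deepspeed_enabled` attribute that only newer `Trainer.__init__` versions set. Downgrading makes the two match again. If you need to stay on 4.30.x, below is a minimal, untested sketch of an alternative workaround, assuming you are not training with DeepSpeed; the `trainer_seq2seq` import mirrors the repo's `ptuning/` layout.

```python
# Workaround sketch for ptuning/main.py, assuming DeepSpeed is NOT in use.
# Newer transformers releases set is_deepspeed_enabled in Trainer.__init__,
# but the trainer.py bundled under ptuning/ predates that attribute, while
# the _get_learning_rate() pulled in from the installed transformers 4.30.x
# still reads it. A class-level default of False avoids the AttributeError.
from trainer_seq2seq import Seq2SeqTrainer

if not hasattr(Seq2SeqTrainer, "is_deepspeed_enabled"):
    Seq2SeqTrainer.is_deepspeed_enabled = False
```

Downgrading remains the verified fix; the patch above only silences the attribute lookup and does nothing to add actual DeepSpeed support.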