```
/root/anaconda3/envs/new_llm/lib/python3.10/site-packages/accelerate/accelerator.py:432: FutureWarning: Passing the following arguments to `Accelerator` is deprecated and will be removed in version 1.0 of Accelerate: dict_keys(['dispatch_batches', 'split_batches', 'even_batches', 'use_seedable_sampler']). Please pass an `accelerate.DataLoaderConfiguration` instead:
dataloader_config = DataLoaderConfiguration(dispatch_batches=None, split_batches=False, even_batches=True, use_seedable_sampler=True)
  warnings.warn(
```
```
Traceback (most recent call last):
  File "/root/paddlejob/workspace/20240315/0_llm/new_llm/LLaMA-Factory-main/src/train_bash.py", line 14, in <module>
    main()
  File "/root/paddlejob/workspace/20240315/0_llm/new_llm/LLaMA-Factory-main/src/train_bash.py", line 5, in main
    run_exp()
  File "/root/paddlejob/workspace/20240315/0_llm/new_llm/LLaMA-Factory-main/src/llmtuner/train/tuner.py", line 32, in run_exp
    run_sft(model_args, data_args, training_args, finetuning_args, generating_args, callbacks)
  File "/root/paddlejob/workspace/20240315/0_llm/new_llm/LLaMA-Factory-main/src/llmtuner/train/sft/workflow.py", line 54, in run_sft
    trainer = CustomSeq2SeqTrainer(
  File "/root/anaconda3/envs/new_llm/lib/python3.10/site-packages/transformers/trainer_seq2seq.py", line 56, in __init__
    super().__init__(
  File "/root/anaconda3/envs/new_llm/lib/python3.10/site-packages/transformers/trainer.py", line 527, in __init__
    raise RuntimeError(
RuntimeError: Passing `optimizers` is not allowed if Deepspeed or PyTorch FSDP is enabled. You should subclass `Trainer` and override the `create_optimizer_and_scheduler` method.
```
The error above is raised during SFT training with LLaMA-Factory when a DeepSpeed ZeRO-3 config is enabled.
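The `RuntimeError` itself names the fix: when DeepSpeed or FSDP is enabled, `Trainer` refuses a pre-built `optimizers=(optimizer, scheduler)` pair, because the optimizer must be created after the model is wrapped and partitioned. The workaround is to stop passing `optimizers=` into the trainer's constructor and instead subclass the trainer and override `create_optimizer_and_scheduler`. A hedged sketch of the pattern (the class name is hypothetical, and the override below just delegates to the stock `create_optimizer`/`create_scheduler` helpers; replace the bodies with whatever custom optimizer was previously being passed in):

```python
from transformers import Seq2SeqTrainer


class DeepSpeedSafeSeq2SeqTrainer(Seq2SeqTrainer):
    """Builds the optimizer/scheduler lazily instead of receiving them
    via the `optimizers=` constructor argument, which Trainer rejects
    under DeepSpeed ZeRO-3 and PyTorch FSDP."""

    def create_optimizer_and_scheduler(self, num_training_steps: int):
        # Trainer calls this hook at train time, after the DeepSpeed/FSDP
        # engine has been set up, so objects created here are compatible
        # with ZeRO-3 parameter partitioning.
        self.create_optimizer()  # stock optimizer from TrainingArguments; customize here
        self.create_scheduler(
            num_training_steps=num_training_steps,
            optimizer=self.optimizer,
        )
```

With this subclass, the `optimizers=` argument should be dropped from the trainer construction in `workflow.py`, and any optimizer customization moved into the override.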