datawhalechina / self-llm

"A Practical Guide to Open-Source LLMs": quickly deploy open-source large models on Linux; a deployment tutorial tailored for users in China
Apache License 2.0

Qwen2 LoRA fine-tuning issue #209

Open zbbb31 opened 1 month ago

zbbb31 commented 1 month ago

LoRA fine-tuning fails with this error:

```
---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
Cell In[49], line 1
----> 1 trainer = Trainer(
      2     model=model,
      3     args=args,
      4     train_dataset=tokenized_id,
      5     data_collator=DataCollatorForSeq2Seq(tokenizer=tokenizer, padding=True),
      6 )

File ~/miniconda3/lib/python3.10/site-packages/transformers/trainer.py:402, in Trainer.__init__(self, model, args, data_collator, train_dataset, eval_dataset, tokenizer, model_init, compute_metrics, callbacks, optimizers, preprocess_logits_for_metrics)
    399 self.deepspeed = None
    400 self.is_in_train = False
--> 402 self.create_accelerator_and_postprocess()
    404 # memory metrics - must set up as early as possible
    405 self._memory_tracker = TrainerMemoryTracker(self.args.skip_memory_metrics)

File ~/miniconda3/lib/python3.10/site-packages/transformers/trainer.py:4535, in Trainer.create_accelerator_and_postprocess(self)
   4532 args.update(accelerator_config)
   4534 # create accelerator object
-> 4535 self.accelerator = Accelerator(**args)
   4536 # some Trainer classes need to use gather instead of gather_for_metrics, thus we store a flag
   4537 self.gather_function = self.accelerator.gather_for_metrics

TypeError: Accelerator.__init__() got an unexpected keyword argument 'use_seedable_sampler'
```

How can I fix this?

KMnO4-zx commented 1 month ago

Try upgrading to the latest transformers.

RuiDW commented 1 month ago

This is likely caused by incompatible versions of the accelerate and transformers libraries. Try installing the versions specified by the author.
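As a first diagnostic step before re-pinning anything, it helps to see which versions are actually installed. The following is a minimal sketch (the helper name `report_versions` and the inclusion of `peft` alongside the two packages from the traceback are my own choices):

```python
from importlib.metadata import PackageNotFoundError, version


def report_versions(packages=("transformers", "accelerate", "peft")):
    """Return a mapping of package name -> installed version string (or None)."""
    versions = {}
    for pkg in packages:
        try:
            versions[pkg] = version(pkg)
        except PackageNotFoundError:
            versions[pkg] = None  # package not installed in this environment
    return versions


if __name__ == "__main__":
    for pkg, ver in report_versions().items():
        print(f"{pkg}: {ver or 'not installed'}")
```

The traceback shows a newer transformers passing the `use_seedable_sampler` keyword to `Accelerator`, which an older accelerate does not accept, so if the reported accelerate version lags well behind transformers, upgrading both together (or pinning both to the versions from the tutorial's requirements) should resolve the mismatch.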