Traceback (most recent call last):
  File "/home/pluskars/alpaca_lora_4bit/finetune.py", line 203, in <module>
    trainer.train()
  File "/home/pluskars/.local/lib/python3.10/site-packages/transformers/trainer.py", line 1555, in train
    return inner_training_loop(
  File "/home/pluskars/.local/lib/python3.10/site-packages/transformers/trainer.py", line 1929, in _inner_training_loop
    self._maybe_log_save_evaluate(tr_loss, model, trial, epoch, ignore_keys_for_eval)
  File "/home/pluskars/.local/lib/python3.10/site-packages/transformers/trainer.py", line 2267, in _maybe_log_save_evaluate
    self._save_checkpoint(model, trial, metrics=metrics)
  File "/home/pluskars/.local/lib/python3.10/site-packages/transformers/trainer.py", line 2324, in _save_checkpoint
    self.save_model(output_dir, _internal_call=True)
  File "/home/pluskars/.local/lib/python3.10/site-packages/transformers/trainer.py", line 2807, in save_model
    self._save(output_dir)
  File "/home/pluskars/.local/lib/python3.10/site-packages/transformers/trainer.py", line 2865, in _save
    self.model.save_pretrained(
  File "/home/pluskars/.local/lib/python3.10/site-packages/peft/peft_model.py", line 162, in save_pretrained
    self.create_or_update_model_card(save_directory)
  File "/home/pluskars/.local/lib/python3.10/site-packages/peft/peft_model.py", line 637, in create_or_update_model_card
    quantization_config = self.config.quantization_config.to_dict()
I am getting the error above when the trainer tries to save a checkpoint during finetuning. I am running with these settings:

Not sure what to do with this.
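My guess from the last frame is that peft's create_or_update_model_card assumes the base model's config object carries a quantization_config attribute, which the config in my 4-bit setup apparently does not. A minimal sketch of that access pattern (FakeConfig below is a stand-in I made up, not the real config class):

```python
# Reproduce the access pattern from the failing line in peft_model.py:
#   quantization_config = self.config.quantization_config.to_dict()
# If the config object has no quantization_config attribute, the
# attribute lookup itself raises AttributeError before to_dict() runs.
class FakeConfig:
    """Stand-in for a model config without quantization_config set."""
    pass

config = FakeConfig()
try:
    config.quantization_config.to_dict()
except AttributeError as exc:
    # The lookup fails on the missing attribute, matching the traceback's
    # final frame; the actual exception message was cut off in my paste.
    print(f"AttributeError: {exc}")
```

If that is what is happening, the save itself may have already completed by the time the model-card step fails, but I would still like to know the proper fix.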