johnsmith0031 / alpaca_lora_4bit

MIT License
533 stars 84 forks

AttributeError: 'dict' object has no attribute 'to_dict' #153

Closed kkaarrss closed 11 months ago

kkaarrss commented 11 months ago

I am getting an error:

Traceback (most recent call last):
  File "/home/pluskars/alpaca_lora_4bit/finetune.py", line 203, in <module>
    trainer.train()
  File "/home/pluskars/.local/lib/python3.10/site-packages/transformers/trainer.py", line 1555, in train
    return inner_training_loop(
  File "/home/pluskars/.local/lib/python3.10/site-packages/transformers/trainer.py", line 1929, in _inner_training_loop
    self._maybe_log_save_evaluate(tr_loss, model, trial, epoch, ignore_keys_for_eval)
  File "/home/pluskars/.local/lib/python3.10/site-packages/transformers/trainer.py", line 2267, in _maybe_log_save_evaluate
    self._save_checkpoint(model, trial, metrics=metrics)
  File "/home/pluskars/.local/lib/python3.10/site-packages/transformers/trainer.py", line 2324, in _save_checkpoint
    self.save_model(output_dir, _internal_call=True)
  File "/home/pluskars/.local/lib/python3.10/site-packages/transformers/trainer.py", line 2807, in save_model
    self._save(output_dir)
  File "/home/pluskars/.local/lib/python3.10/site-packages/transformers/trainer.py", line 2865, in _save
    self.model.save_pretrained(
  File "/home/pluskars/.local/lib/python3.10/site-packages/peft/peft_model.py", line 162, in save_pretrained
    self.create_or_update_model_card(save_directory)
  File "/home/pluskars/.local/lib/python3.10/site-packages/peft/peft_model.py", line 637, in create_or_update_model_card
    quantization_config = self.config.quantization_config.to_dict()
AttributeError: 'dict' object has no attribute 'to_dict'

I am running with these settings:

 python3 finetune.py ./data.txt \
    --ds_type=txt \
    --lora_out_dir=./test/ \
    --llama_q4_config_dir=./Llama-2-7b-Chat-GPTQ/ \
    --llama_q4_model=./Llama-2-7b-Chat-GPTQ/ \
    --mbatch_size=1 \
    --batch_size=1 \
    --epochs=3 \
    --lr=3e-4 \
    --cutoff_len=256 \
    --lora_r=8 \
    --lora_alpha=16 \
    --lora_dropout=0.05 \
    --warmup_steps=5 \
    --save_steps=50 \
    --save_total_limit=3 \
    --logging_steps=5 \
    --groupsize=128 \
    --backend=cuda \
    --local_rank=1 \
    --val_set_size=0

Not sure what to do with this.

kkaarrss commented 11 months ago

It seems to be a problem with PEFT or with the config.json. I edited the line and removed the .to_dict() call, and then it works.
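
For reference, a less invasive edit than deleting the call outright would be to guard it: the traceback shows PEFT calling .to_dict() on self.config.quantization_config, which here is apparently already a plain dict (as read straight from the GPTQ config.json) rather than a config object. A minimal sketch of how line 637 of peft/peft_model.py could be patched locally (an assumption about a workable guard, not the upstream fix):

    # peft/peft_model.py, create_or_update_model_card (around line 637)
    # quantization_config may already be a plain dict when it was loaded
    # directly from config.json, so only call .to_dict() if it exists.
    quantization_config = self.config.quantization_config
    if hasattr(quantization_config, "to_dict"):
        quantization_config = quantization_config.to_dict()

This keeps the original behavior for proper quantization config objects while avoiding the AttributeError for dict-based configs.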