bash scripts/merge_lora.sh
Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
Loading checkpoint shards: 100%|███| 2/2 [01:02<00:00, 31.25s/it]
Traceback (most recent call last):
File "/home/ubuntu/SB_Tushar/lora_phi3/Phi3-Vision-ft/src/merge_lora_weights.py", line 31, in <module>
merge_lora(args)
File "/home/ubuntu/SB_Tushar/lora_phi3/Phi3-Vision-ft/src/merge_lora_weights.py", line 12, in merge_lora
accel.save(model, args.save_model_path, max_shard_size = '5GB')
TypeError: Accelerator.save() got an unexpected keyword argument 'max_shard_size'
@tusharraskar I've updated the code to use accel.save_model, but the change wasn't pushed properly.
Sorry for the inconvenience. You could change the function as I mentioned.
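To illustrate why the traceback occurs: in the accelerate library, Accelerator.save is a thin wrapper around torch.save and does not accept a max_shard_size argument, while Accelerator.save_model does (it shards large checkpoints). The stub below is a minimal stand-in, not the real accelerate API, that mimics just those two signatures to show the failing call and the suggested fix:

```python
# Hypothetical stub mimicking the relevant accelerate signatures -- not the
# real library, just an illustration of the keyword-argument mismatch.
class Accelerator:
    def save(self, obj, path):
        # Accelerator.save takes no max_shard_size, hence the TypeError.
        return f"saved {obj} to {path}"

    def save_model(self, model, save_directory, max_shard_size="10GB"):
        # Accelerator.save_model shards the checkpoint, so it does accept it.
        return f"saved {model} to {save_directory} in shards of {max_shard_size}"

accel = Accelerator()

# The failing call from the traceback: unexpected keyword raises TypeError.
try:
    accel.save("model", "out/", max_shard_size="5GB")
except TypeError as err:
    print("TypeError:", err)

# The fix: call save_model instead, which accepts max_shard_size.
print(accel.save_model("model", "out/", max_shard_size="5GB"))
```

In the actual merge_lora function, this means replacing the accel.save(...) call on line 12 of merge_lora_weights.py with accel.save_model(model, args.save_model_path, max_shard_size='5GB').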