meta-llama / codellama

Inference code for CodeLlama models

CodeLlama-34b Fine-Tune Evaluation #200

Closed sanipanwala closed 8 months ago

sanipanwala commented 8 months ago

Hello,

I have finished training and saved the model and the adapter config file to local disk.

When I load the model from local disk again to generate output, I get the error below.

Can anyone help me with this issue?

```
File "PythonV2.py", line 11, in <module>
    model = AutoPeftModelForCausalLM.from_pretrained(
File "/python3.12/site-packages/peft/auto.py", line 127, in from_pretrained
    return cls._target_peft_class.from_pretrained(
File "/python3.12/site-packages/peft/peft_model.py", line 354, in from_pretrained
    model.load_adapter(model_id, adapter_name, is_trainable=is_trainable, **kwargs)
File "/python3.12/site-packages/peft/peft_model.py", line 695, in load_adapter
    adapters_weights = load_peft_weights(model_id, device=torch_device, **hf_hub_download_kwargs)
File "/python3.12/site-packages/peft/utils/save_and_load.py", line 313, in load_peft_weights
    adapters_weights = safe_load_file(filename, device=device)
File "/python3.12/site-packages/safetensors/torch.py", line 308, in load_file
    with safe_open(filename, framework="pt", device=device) as f:
safetensors_rust.SafetensorError: Error while deserializing header: InvalidHeaderDeserialization
```

Thanks.
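(Editor's note: the failing call chain ends in safetensors' header parse, and `InvalidHeaderDeserialization` usually means the saved adapter file is truncated or was written by a different serializer. A quick way to sanity-check the file is to validate its prefix by hand. This is a sketch based on the documented safetensors layout, an 8-byte little-endian header length followed by that many bytes of JSON; it only checks the header, not the tensor payload, and the function name is the editor's own.)

```python
import json
import struct
from pathlib import Path

def safetensors_header_ok(path):
    """Return True if the file starts with a parseable safetensors header.

    The safetensors format begins with an 8-byte little-endian unsigned
    integer giving the header length, followed by that many bytes of JSON.
    A corrupt or truncated prefix (e.g. a file actually written with
    torch.save) is a typical cause of InvalidHeaderDeserialization.
    """
    data = Path(path).read_bytes()
    if len(data) < 8:
        return False  # too short to even hold the length prefix
    (header_len,) = struct.unpack("<Q", data[:8])
    if len(data) < 8 + header_len:
        return False  # claimed header runs past end of file: truncated
    try:
        json.loads(data[8:8 + header_len].decode("utf-8"))
        return True
    except (UnicodeDecodeError, json.JSONDecodeError):
        return False  # header bytes are not valid JSON
```

Running this against the saved `adapter_model.safetensors` distinguishes a corrupt file (re-save the adapter) from a loader-side problem.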

jgehring commented 8 months ago

Hi @sanipanwala, I'd post this question in the GitHub issues or other forums of the fine-tuning and/or inference library you're using (presumably https://github.com/huggingface/transformers?). The codellama repository offers a reference implementation for inference only.