albertan017 / LLM4Decompile

Reverse Engineering: Decompiling Binary Code with Large Language Models
https://arxiv.org/abs/2403.05286
MIT License

Error while deserializing header: MetadataIncompleteBuffer #35

Open blacksunfm opened 2 days ago

blacksunfm commented 2 days ago

I am trying to evaluate llm4decompile-6.7b-v1.5 using the methods you provided. The model weights were downloaded from the Hugging Face repository of the same name. However, I keep encountering an error indicating that the weight files are incorrect. Below is the error message:

(llm4decompile) root@autodl-container-b52c468700-a1cda26e:~/LLM4Decompile# python ./evaluation/run_evaluation_llm4decompile_singleGPU.py
Traceback (most recent call last):
  File "/root/LLM4Decompile/./evaluation/run_evaluation_llm4decompile_singleGPU.py", line 75, in <module>
    model = AutoModelForCausalLM.from_pretrained(args.model_path,torch_dtype=torch.bfloat16).cuda()
  File "/root/miniconda3/envs/llm4decompile/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 564, in from_pretrained
    return model_class.from_pretrained(
  File "/root/miniconda3/envs/llm4decompile/lib/python3.10/site-packages/transformers/modeling_utils.py", line 3994, in from_pretrained
    with safe_open(resolved_archive_file, framework="pt") as f:
safetensors_rust.SafetensorError: Error while deserializing header: MetadataIncompleteBuffer
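
In case it helps with debugging, here is a quick check I would run to see whether any of the downloaded shards is truncated (the directory ./llm4decompile-6.7b-v1.5 is just where I placed the weights; adjust the path as needed):

```python
# Quick integrity check: try to open each downloaded safetensors shard and
# read its header. The model directory below is my local download location.
import glob
import os

from safetensors import safe_open

model_dir = "./llm4decompile-6.7b-v1.5"  # adjust to your local path

for shard in sorted(glob.glob(os.path.join(model_dir, "*.safetensors"))):
    size_gb = os.path.getsize(shard) / 1e9
    try:
        with safe_open(shard, framework="pt") as f:
            num_tensors = len(f.keys())
        print(f"OK   {shard}: {size_gb:.2f} GB, {num_tensors} tensors")
    except Exception as err:  # a truncated shard fails while parsing the header
        print(f"FAIL {shard}: {size_gb:.2f} GB, {err}")
```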

Could you help me understand why this error occurs and how to fix it? Thank you!

albertan017 commented 1 day ago

Please use the vLLM script.

Other scripts have not been updated.
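
If you want a minimal standalone check with vLLM in the meantime, something along these lines should work (the local model path, sampling settings, and input file here are placeholders, not the exact setup of the evaluation script):

```python
# Minimal vLLM sketch: load the local checkpoint and decompile one input.
# Paths and parameters are illustrative; see the repo's vLLM evaluation
# script for the actual prompt construction and settings.
from vllm import LLM, SamplingParams

model_path = "./llm4decompile-6.7b-v1.5"  # local download of the HF weights
llm = LLM(model=model_path, dtype="bfloat16")
params = SamplingParams(temperature=0, max_tokens=2048)

with open("sample.asm") as f:  # assembly prepared as described in the README
    prompt = f.read()

outputs = llm.generate([prompt], params)
print(outputs[0].outputs[0].text)
```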

Regarding your error, I believe it is associated with the environment rather than the model. You might need to verify your transformers version and consider setting trust_remote_code=True.
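
For example (purely illustrative, using a local path), you can print the installed version and retry the load with trust_remote_code=True like this:

```python
# Illustrative retry on the transformers side: check the installed version,
# then load the local checkpoint with trust_remote_code=True in bfloat16.
import torch
import transformers
from transformers import AutoModelForCausalLM, AutoTokenizer

print("transformers", transformers.__version__)

model_path = "./llm4decompile-6.7b-v1.5"  # adjust to your local path
tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(
    model_path,
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,
).cuda()
```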