CASIA-IVA-Lab / AnomalyGPT

[AAAI 2024 Oral] AnomalyGPT: Detecting Industrial Anomalies Using Large Vision-Language Models
https://anomalygpt.github.io

Error when combining the LLaMA and delta weights #64

Open lalisa2020 opened 9 months ago

lalisa2020 commented 9 months ago

When I run the command: python -m fastchat.model.apply_delta --base {path_to_llama_weights} --target ./vicuna_ckpt/7b_v0/ --delta {path_to_delta_vicuna_weights}
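For context, apply_delta reconstructs the Vicuna weights by adding each delta tensor to the corresponding base LLaMA tensor. A minimal pure-Python sketch of that idea (a hypothetical stand-in using lists of floats instead of torch tensors; the real script loads and saves torch checkpoints):

```python
def apply_delta(base_state, delta_state):
    """Sketch of delta-weight application: target = base + delta,
    computed elementwise per parameter name. Hypothetical stand-in
    for fastchat.model.apply_delta, with lists instead of tensors."""
    assert base_state.keys() == delta_state.keys(), "parameter names must match"
    return {
        name: [b + d for b, d in zip(base_state[name], delta_state[name])]
        for name in base_state
    }

# tiny example with one fake parameter
base = {"layer.weight": [1.0, 2.0, 3.0]}
delta = {"layer.weight": [0.5, -1.0, 0.0]}
target = apply_delta(base, delta)  # {"layer.weight": [1.5, 1.0, 3.0]}
```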

I got this error: Unable to load weights from pytorch checkpoint file for './vicuna-7b/pytorch_model-00001-of-00002.bin' at './vicuna-7b/pytorch_model-00001-of-00002.bin'. If you tried to load a PyTorch model from a TF 2.0 checkpoint, please set from_tf=True.

lalisa2020 commented 9 months ago

I could not run this command: pip install git+https://github.com/lm-sys/FastChat.git@v0.1.10

So I used this command instead: pip3 install fschat

Could this be the cause of the problem?

zhtstar commented 9 months ago

It may be caused by the versions of fschat and transformers you are using. This page describes which fschat and transformers versions go with each version of the model weights; I hope it helps: https://github.com/lm-sys/FastChat/blob/main/docs/vicuna_weights_version.md#how-to-apply-delta-weights-for-weights-v11-and-v0

Here are my settings, with no errors reported: base weights llama-7b-hf, delta weights vicuna-7b-delta-v1.1, fschat 0.2.1, transformers 4.37.0.dev0.

Also, if you cannot run pip install git+https://github.com/lm-sys/FastChat.git@v0.1.10, you can run pip install fschat==0.1.10 instead.
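To confirm what is actually installed before re-running the merge, here is a small stdlib sketch (a hypothetical helper; the version pins above are simply the combination that worked for me):

```python
from importlib.metadata import version, PackageNotFoundError

def installed_version(pkg):
    """Return the installed version string for pkg, or None if it is
    not present in the current environment."""
    try:
        return version(pkg)
    except PackageNotFoundError:
        return None

# print the versions relevant to the delta-merge step
for pkg in ("fschat", "transformers"):
    print(pkg, installed_version(pkg))
```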

lalisa2020 commented 9 months ago


Thanks!!

FantasticGNU commented 9 months ago

@zhtstar Thank you for your answer! I will keep this issue open for others to view.