BAAI-DCAI / Bunny

A family of lightweight multimodal models.
Apache License 2.0

Missing configuration file when loading merged weights #26

Closed jiayuww closed 5 months ago

jiayuww commented 5 months ago

Thanks for open-sourcing your great work! I'm interested in experimenting with other backbone models and vision encoders. However, when I attempted to load merged weights from a locally saved path, I received the following error:

requests.exceptions.HTTPError: 404 Client Error: Not Found for url: https://huggingface.co/BoyaWu10/bunny-phi-2-eva-lora/resolve/main/configuration_phi.py.

I used the following code to load the model:

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained(
    '/path/to/local/weights',
    torch_dtype=torch.float16,
    device_map='auto',
    trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained(
    '/path/to/local/weights',
    trust_remote_code=True)

Any suggestions?

Isaachhh commented 5 months ago

The snippet in Quickstart only works for Bunny-v1.0-3B (SigLIP + Phi-2): for that model we merged the configuration code into single files for users' convenience. You can compare modeling_bunny_phi.py and configuration_bunny_phi.py with their related parts in the Bunny source code to see the difference.

For other models, we currently only support loading by installing Bunny from source.

Please refer to cli-inference. You can check bunny/serve/cli.py to see how we load the model in general.

jiayuww commented 5 months ago

Thanks!