ml-research / LlavaGuard

Apache License 2.0

Failed to download LlavaGuard checkpoints from huggingface #2

Closed lemon-awa closed 4 months ago

lemon-awa commented 5 months ago

I have tried a few ways to download the LlavaGuard model:

  1. I tried to download it directly from Hugging Face with `model_path = "AIML-TUDA/LlavaGuard-7B"` and `tokenizer_path = "llava-hf/llava-1.5-7b-hf"`, but I got the error: `AIML-TUDA/LlavaGuard-7B does not appear to have a file named preprocessor_config.json. Checkout 'https://huggingface.co/AIML-TUDA/LlavaGuard-7B/tree/main' for available files.`
  2. I read your repo; the Launch LlavaGuard Server section says that three checkpoints are provided, but I can't find any code in the repository that downloads them.
lukashelff commented 5 months ago

SGlang should take care of downloading the model and tokenizer. Did you try launching the server with SGlang using the following command? `python3 -m sglang.launch_server --model-path AIML-TUDA/LlavaGuard-7B --tokenizer-path llava-hf/llava-1.5-7b-hf --port 10000` Otherwise, you can also clone the two repos and provide the local paths of the model and tokenizer when you launch the server.
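The two options above can be sketched as follows. The model path, tokenizer path, and port are taken from the command in this thread; the `git lfs` clone is the standard way to fetch full Hub repositories and is an assumption here, not something stated in the repo (both options require a GPU and network access):

```shell
# Option 1: let SGlang pull both repos from the Hugging Face Hub automatically
python3 -m sglang.launch_server \
  --model-path AIML-TUDA/LlavaGuard-7B \
  --tokenizer-path llava-hf/llava-1.5-7b-hf \
  --port 10000

# Option 2: clone the repos yourself (requires git-lfs for the weight files)
# and point the server at the local copies
git lfs install
git clone https://huggingface.co/AIML-TUDA/LlavaGuard-7B
git clone https://huggingface.co/llava-hf/llava-1.5-7b-hf

python3 -m sglang.launch_server \
  --model-path ./LlavaGuard-7B \
  --tokenizer-path ./llava-1.5-7b-hf \
  --port 10000
```

With Option 2 the download happens through git rather than through SGlang, which can be easier to debug when the automatic download fails.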

lemon-awa commented 5 months ago

Is SGlang the only way to download the checkpoints? Does Hugging Face not work? Also, after downloading the model and tokenizer checkpoints through SGlang, what code do I need to load them? I get errors with both `LlavaLlamaForCausalLM.from_pretrained` and `AutoModelForCausalLM.from_pretrained`.

lemon-awa commented 5 months ago
[Screenshot: error traceback, 2024-07-03]

Also, I get errors like this when I run `python3 -m sglang.launch_server --model-path AIML-TUDA/LlavaGuard-7B --tokenizer-path llava-hf/llava-1.5-7b-hf --port 10000`.

lukashelff commented 4 months ago

There seems to be a problem with your dependencies. You can try the Dockerfile provided by SGlang, which should work without any further installation.
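Once the server is up, the model is not loaded with `from_pretrained` at all; it is queried through the SGlang endpoint. A minimal sketch, assuming SGlang's frontend API, a server on port 10000, and a placeholder image path and prompt (LlavaGuard expects its full safety-policy prompt, which is not reproduced here):

```python
import sglang as sgl

@sgl.function
def guard(s, image_path, prompt):
    # Send the image together with the (policy) prompt and generate a response.
    s += sgl.user(sgl.image(image_path) + prompt)
    s += sgl.assistant(sgl.gen("assessment", max_tokens=512))

# Point the sglang frontend at the server started with sglang.launch_server.
sgl.set_default_backend(sgl.RuntimeEndpoint("http://localhost:10000"))

state = guard.run(
    image_path="example.jpg",            # placeholder image
    prompt="Assess the safety of this image.",  # placeholder; use the real policy prompt
)
print(state["assessment"])
```

This keeps all model loading inside the server process, which is why `AutoModelForCausalLM.from_pretrained` on the raw checkpoint fails: the Hub repo is laid out for the LLaVA/SGlang loader, not for the stock `transformers` auto classes.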