Closed Jeffwan closed 1 week ago
### 📚 The doc issue

https://docs.vllm.ai/en/latest/models/lora.html describes the steps to load a LoRA model:

```shell
python -m vllm.entrypoints.openai.api_server \
    --model meta-llama/Llama-2-7b-hf \
    --enable-lora \
    --lora-modules sql-lora=~/.cache/huggingface/hub/models--yard1--llama-2-7b-sql-lora-test/
```

There are two issues:

1. The documented path is missing the trailing snapshot directory, `snapshots/0dfa347e8877a4d4ed19ee56c140fa518470028c`.
2. `~` is not expanded automatically, so loading the model fails; at the moment, relative paths are not supported.

I will submit a PR for a short-term fix and a separate PR to support `~` and dynamic loading from the model registry.
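Putting the two fixes together, the working invocation can be sketched as follows (the snapshot hash is the one shown above; `$HOME` stands in for `~` and is expanded by the shell):

```shell
python -m vllm.entrypoints.openai.api_server \
    --model meta-llama/Llama-2-7b-hf \
    --enable-lora \
    --lora-modules sql-lora=$HOME/.cache/huggingface/hub/models--yard1--llama-2-7b-sql-lora-test/snapshots/0dfa347e8877a4d4ed19ee56c140fa518470028c/
```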
### Screenshots

Path documented:

![image](https://github.com/vllm-project/vllm/assets/4739316/33675848-dbd3-43ce-b3bd-e830034e1cbb)

Update the path with the appended snapshot commit id:

![image](https://github.com/vllm-project/vllm/assets/4739316/3624eeb7-3afb-483e-b722-8df61b8dc2d6)

Update to an absolute path:

![image](https://github.com/vllm-project/vllm/assets/4739316/b7b8a1be-f317-46b4-86de-7a6a3dac5fea)

### Suggest a potential alternative/fix

Append the snapshot commit id directory `snapshots/0dfa347e8877a4d4ed19ee56c140fa518470028c` to the documented path, and change `~` to `$HOME` so that the shell expands it to an absolute path.
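To see why `$HOME` works where `~` does not: the shell only performs tilde expansion at the start of a word or in a valid `NAME=value` assignment, and `sql-lora=~/...` is neither (the name contains a hyphen), so the literal `~` reaches the server, which does not expand it either. A quick sketch of the difference:

```shell
# Not tilde-expanded: the word looks like an assignment, but
# `sql-lora` is not a valid variable name, so the shell passes
# the `~` through literally and the server receives a path it
# cannot resolve.
printf '%s\n' sql-lora=~/.cache/huggingface

# `$HOME` is expanded in any unquoted or double-quoted context,
# so the server receives an absolute path.
printf '%s\n' "sql-lora=$HOME/.cache/huggingface"
```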