defog-ai / sql-eval
Evaluate the accuracy of LLM generated outputs
Apache License 2.0
LoRA support for vLLM #169
Closed: wendy-aw closed 3 weeks ago

wendy-aw commented 3 weeks ago
LoRA adapters can now be used with vLLM on top of a base model in `api_runner.py` and `vllm_runner.py`. Note that for now, LoRA ranks must be one of (8, 16, 32, 64).
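For reference, a LoRA adapter can be loaded on top of a base model with vLLM's offline `LLM` API roughly as sketched below. This is a minimal, hedged sketch, not the repo's actual runner code: the base model name, adapter name, and adapter path are placeholders, and the early rank check simply encodes the (8, 16, 32, 64) constraint mentioned above.

```python
# Sketch: running a LoRA adapter on top of a base model with vLLM.
# Model name, adapter id, and adapter path below are illustrative placeholders.

SUPPORTED_LORA_RANKS = (8, 16, 32, 64)  # ranks vLLM currently accepts


def check_lora_rank(rank: int) -> int:
    """Fail early if the adapter's rank is not one vLLM supports."""
    if rank not in SUPPORTED_LORA_RANKS:
        raise ValueError(f"LoRA rank {rank} not in {SUPPORTED_LORA_RANKS}")
    return rank


def main() -> None:
    # Imported here so the rank check above is usable without vLLM installed.
    from vllm import LLM, SamplingParams
    from vllm.lora.request import LoRARequest

    llm = LLM(
        model="defog/sqlcoder-7b-2",        # placeholder base model
        enable_lora=True,
        max_lora_rank=check_lora_rank(64),  # must be 8, 16, 32, or 64
    )
    outputs = llm.generate(
        ["-- Write a SQL query answering the question here"],
        SamplingParams(max_tokens=256, temperature=0.0),
        # Adapter name, id, and local path are placeholders.
        lora_request=LoRARequest("sql_adapter", 1, "/path/to/lora_adapter"),
    )
    print(outputs[0].outputs[0].text)


if __name__ == "__main__":
    main()
```

Checking the rank before constructing the `LLM` gives a clearer error than letting vLLM reject the adapter at load time.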