defog-ai / sql-eval

Evaluate the accuracy of LLM-generated outputs
Apache License 2.0
485 stars · 52 forks

Update run_checkpoints.sh to use the vllm api server instead of offline inference #144

Closed by rishsriv 2 months ago

rishsriv commented 2 months ago

Using the vLLM API server instead of offline inference keeps the evaluation setup consistent with how results would be generated in a production scenario.
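The change described in the title can be sketched as follows. This is a hypothetical illustration, not the actual diff from the PR: the model name, port, and eval invocation are placeholders, and the real flags in run_checkpoints.sh may differ.

```shell
# Before (offline inference): the eval script loaded the model in-process
# and generated completions directly.

# After (API server): launch vLLM's OpenAI-compatible API server first,
# then point the eval at its HTTP endpoint, mirroring a production setup.
# Model and port below are placeholders.
python -m vllm.entrypoints.openai.api_server \
    --model defog/sqlcoder-7b-2 \
    --port 8000 &
SERVER_PID=$!

# Wait until the server is ready before starting the eval run.
until curl -s http://localhost:8000/health > /dev/null; do
    sleep 1
done

# Run the evaluation against the server's endpoint (invocation is illustrative).
# python main.py --api_url http://localhost:8000/v1 ...

# Shut the server down when the run finishes.
kill "$SERVER_PID"
```

Serving over HTTP exercises the same request/response path (batching, scheduling, sampling parameters) that a deployed model would use, which is the consistency the PR description refers to.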