meta-llama / codellama

Inference code for CodeLlama models

Is the vllm integrated into this repo? #243

Closed jeremy-chy closed 1 month ago

jeremy-chy commented 2 months ago

Hi, I want to ask whether vLLM is integrated into the inference script of this repo.

If not, how can we run CodeLlama with vLLM?

Thanks a lot!!
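For reference, a minimal sketch of running CodeLlama outside this repo via vLLM's offline Python API is below. This assumes `pip install vllm`, a CUDA-capable GPU, and the Hugging Face checkpoint `codellama/CodeLlama-7b-hf` (the model id and sampling values here are illustrative, not part of this repo's scripts):

```python
def generate_completions(prompts, model="codellama/CodeLlama-7b-hf"):
    """Batched generation with vLLM; requires a GPU when actually called."""
    # Lazy import: vLLM initializes CUDA state, so keep it out of module import.
    from vllm import LLM, SamplingParams

    llm = LLM(model=model)  # downloads/loads the HF checkpoint
    params = SamplingParams(temperature=0.2, top_p=0.95, max_tokens=128)
    outputs = llm.generate(prompts, params)
    # Each RequestOutput holds one or more candidate completions.
    return [out.outputs[0].text for out in outputs]


if __name__ == "__main__":
    completions = generate_completions(["def fibonacci(n):"])
    print(completions[0])
```

vLLM also ships an OpenAI-compatible HTTP server (`python -m vllm.entrypoints.openai.api_server --model <model-id>`) if you prefer serving over an API rather than calling the library directly.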