KempnerInstitute / distributed-inference-vllm

Distributed Inference with vLLM
Apache License 2.0

add off-line batch inference example #9

Open · dmbala opened this issue 2 weeks ago

dmbala commented 2 weeks ago

Many users would like to run inference as an offline batch process.
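
A minimal sketch of what such an example could look like, using vLLM's offline `LLM` API; the model name and prompts below are placeholders, not part of this repo:

```python
# Minimal offline batch inference sketch with vLLM.
# Model name and prompts are placeholders for illustration.
from vllm import LLM, SamplingParams

prompts = [
    "Explain distributed inference in one sentence.",
    "What is tensor parallelism?",
]

# Sampling settings for generation.
sampling_params = SamplingParams(temperature=0.8, top_p=0.95, max_tokens=128)

# Load the model once; vLLM batches the prompts internally.
# For multi-GPU runs, tensor_parallel_size can be set here as well.
llm = LLM(model="meta-llama/Llama-3.1-8B-Instruct")

# Generate completions for the whole batch in one call.
outputs = llm.generate(prompts, sampling_params)

for output in outputs:
    print(f"Prompt: {output.prompt!r}")
    print(f"Completion: {output.outputs[0].text!r}")
```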

dmbala commented 2 weeks ago

I will add this example.