KarelDO / xmc.dspy

In-Context Learning for eXtreme Multi-Label Classification (XMC) using only a handful of examples.
MIT License

vLLM Compatibility #5

Open sidjha1 opened 5 months ago

sidjha1 commented 5 months ago

Hello, I was curious whether it is possible to run models locally via vLLM. The README mentions HF TGI for running local models, and looking through the experimental dspy branch, it seems that HF TGI is chosen for the model whenever an OpenAI model is not provided. Should I modify the experimental branch to add vLLM support, or is there another way to run local models on vLLM?

KarelDO commented 5 months ago

Ideally, DSPy handles all of this and IReRa just uses whatever LM you supply. To run with vLLM for now, it is indeed best to change how the models are created in the irera branch of DSPy. Longer term, I'd need to think of a more scalable way of handling model providers.
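For anyone following along, here is a minimal sketch of the kind of provider dispatch being described: the branch's current OpenAI-or-TGI split, with an explicit vLLM option added. All class names, parameters, and the default vLLM URL below are illustrative assumptions, not DSPy's actual API (vLLM does expose an OpenAI-compatible HTTP server, which is what the `base_url` stands in for).

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class LMClient:
    """Hypothetical stand-in for a DSPy LM client configuration."""
    provider: str
    model: str
    base_url: Optional[str] = None


def make_lm(model: str, provider: str = "auto",
            base_url: Optional[str] = None) -> LMClient:
    # Mirror the experimental branch's behaviour: OpenAI-style model names
    # go to OpenAI, everything else falls back to HF TGI -- with "vllm"
    # available as an explicit override.
    if provider == "auto":
        provider = "openai" if model.startswith("gpt-") else "tgi"
    if provider == "vllm" and base_url is None:
        # vLLM's server speaks the OpenAI-compatible API; this default
        # port/path is an assumption for illustration.
        base_url = "http://localhost:8000/v1"
    return LMClient(provider=provider, model=model, base_url=base_url)
```

A factory like this keeps the provider choice in one place, so adding another backend later means one extra branch rather than edits scattered through the pipeline.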