Open S1M0N38 opened 9 months ago
Here is the PR: #59
Hi @S1M0N38, do you use LiteLLM Proxy? Can we hop on a call to learn how we can make litellm better for you?
Link to my cal for your convenience: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat?month=2024-02
@S1M0N38 do you think a change like this would allow integration with something that runs locally, like LM Studio? That application lets you interact with the model via a port on localhost. I would love to try a totally offline and free AI coding experience. https://lmstudio.ai/
The ability to set a `base_url` facilitates the use of this plugin with any OpenAI-like API. For instance, if we set `base_url` to the address of a LiteLLM proxy, it would become possible to use this plugin with a multitude of different LLMs (including open-source local models via Ollama). Naturally, to prevent breaking changes, the default value for such a setting should be `"https://api.openai.com/v1"`. In my opinion, this setting should be incorporated into the `open_ai` settings, because, similar to `api_key`, it is a parameter found in OpenAI's official Python API:
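To illustrate the idea, here is a minimal sketch of how a configurable `base_url` setting could resolve request endpoints while defaulting to OpenAI's official API. The helper name and the local proxy port are hypothetical, purely for illustration:

```python
# Default matches OpenAI's official API, so existing users see no change.
DEFAULT_BASE_URL = "https://api.openai.com/v1"

def chat_completions_url(base_url: str = DEFAULT_BASE_URL) -> str:
    """Build the chat-completions endpoint from a configurable base_url.

    Hypothetical helper: strips any trailing slash so both
    "http://localhost:4000" and "http://localhost:4000/" work.
    """
    return base_url.rstrip("/") + "/chat/completions"

# Default: the official OpenAI endpoint.
print(chat_completions_url())
# → https://api.openai.com/v1/chat/completions

# Pointing at a local LiteLLM proxy (port is an example, not a default).
print(chat_completions_url("http://localhost:4000"))
# → http://localhost:4000/chat/completions
```

The same pattern covers Ollama or LM Studio, since both expose an OpenAI-compatible HTTP endpoint on localhost.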