BerriAI / litellm

Python SDK, Proxy Server to call 100+ LLM APIs using the OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

[Feature]: LM Studio support #3755

Open · mrdjohnson opened this issue 3 months ago

mrdjohnson commented 3 months ago

The Feature

To be able to use litellm with first-party support for LM Studio!

Motivation, pitch

LM Studio is a very popular platform, and it's the best way to get access to Hugging Face models locally. If possible, having access to the LM Studio server through litellm would be amazing.


krrishdholakia commented 3 months ago

Hey @mrdjohnson, can't you already do this via https://docs.litellm.ai/docs/providers/openai_compatible?

model_list:
  - model_name: my-model
    litellm_params:
      model: openai/<your-model-name>  # your lm studio model name
      api_base: <model-api-base>       # lm studio api base
      api_key: api-key                 # lm-studio api key if required (else put a fake key here)

Unclear on what first-party support would look like here?
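
For completeness, the same route through the Python SDK (no proxy) would be a minimal sketch like the one below; the model name and base URL are the same placeholders as in the config above:

import litellm

# Route the call through LiteLLM's generic OpenAI-compatible provider by
# prefixing the model name with "openai/".
response = litellm.completion(
    model="openai/<your-model-name>",   # your LM Studio model name
    api_base="<model-api-base>",        # LM Studio server base URL
    api_key="api-key",                  # fake key if LM Studio doesn't require one
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)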

mrdjohnson commented 3 months ago

That's very true! LM Studio is quite popular, but it's lacking in the first-party support department!

It's awesome that there is OpenAI API compatibility, but the SDK offers things the API doesn't: better insight into model generation statistics, listing all downloaded models, loading models (with progress reporting while they load), and unloading models. The OpenAI API does not account for any of these.
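
To make the gap concrete, here is a rough sketch against LM Studio's OpenAI-compatible server (default local base URL assumed here): model listing is reachable, but load/unload and load progress have no OpenAI-style endpoints.

import requests

BASE = "http://localhost:1234/v1"  # assumed: LM Studio's default server address

# Listing models IS part of the OpenAI-compatible surface:
for model in requests.get(f"{BASE}/models").json()["data"]:
    print(model["id"])

# Loading/unloading a model, or watching load progress, is NOT part of the
# OpenAI API spec, so there is nothing to call here; today that requires the
# LM Studio (TypeScript) SDK instead.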

computersrmyfriends commented 2 months ago

I second that. LM Studio is among the most popular and most commonly used options for local models, probably next to Ollama, yet it isn't listed as supported with any features specific to it.

mrdjohnson commented 2 months ago

I'm working with the LM Studio team! There are a lot of things the TypeScript SDK can do that the API does not provide just yet. As soon as the API allows more options, I'll be able to provide more reasons for first-party support!

krrishdholakia commented 2 months ago

Hey @mrdjohnson, thanks! Happy to add native support once you think it's ready.

daaain commented 1 month ago

It took me a few tries and some unfriendly error messages to figure out that api_base is supposed to be http://localhost:1234/v1, so sharing it here.
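
Putting that together, a working call with LM Studio's default server settings would look roughly like this sketch (the model name is still a placeholder):

import litellm

response = litellm.completion(
    model="openai/<your-model-name>",     # whatever model is loaded in LM Studio
    api_base="http://localhost:1234/v1",  # note the /v1 suffix
    api_key="lm-studio",                  # fake key; only matters if you configured one
    messages=[{"role": "user", "content": "ping"}],
)
print(response.choices[0].message.content)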