BerriAI / litellm

Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

[Bug]: silence noisy error #5683

Closed: okhat closed this issue 1 month ago

okhat commented 1 month ago

What happened?

Following up from #5597. LiteLLM is now in DSPy 2.4.16, and I hope we can fix this in time for DSPy 2.5.

The warning and BadRequestError tracebacks still pop up for our users.

For example, when using OpenAI-compatible servers like SGLang to serve llama3-8b locally, and setting:

model='openai/default', api_base="http://127.0.0.1:30000/v1", api_key="EMPTY"

One gets:

cost_calculator.py:834 - litellm.cost_calculator.py::response_cost_calculator
[...]
litellm.exceptions.BadRequestError: litellm.BadRequestError: LLM Provider NOT provided. Pass in the LLM provider you are trying to call. You passed model=default
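For reference, a minimal sketch of the setup above (the parameter values come from this issue; the commented-out `litellm.completion` call assumes a running SGLang server and is what surfaces the noisy cost-calculator error, since the served model name "default" has no pricing entry):

```python
# Sketch of the repro: routing a local OpenAI-compatible server through LiteLLM.
params = {
    "model": "openai/default",                # "openai/" provider prefix + served model name
    "api_base": "http://127.0.0.1:30000/v1",  # local SGLang endpoint (from the issue)
    "api_key": "EMPTY",                       # placeholder key; local servers often ignore it
}

# With litellm installed and the server running, this is the call that triggers
# the cost_calculator error log, because "default" is not a known priced model:
# import litellm
# response = litellm.completion(
#     messages=[{"role": "user", "content": "Hi"}], **params
# )
```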

IMO, the cost calculator shouldn't be this loud; logging at the error level suggests to users that something is broken.

Relevant log output

No response

Twitter / LinkedIn details

No response

krrishdholakia commented 1 month ago

Hi @okhat, is this on a later version of litellm? The line you're referring to is now a debug message.

[screenshot attached: Screenshot 2024-09-13 at 3 38 14 PM]
krrishdholakia commented 1 month ago

this is as of v1.44.25+ - https://github.com/BerriAI/litellm/commit/0295a2256117c574284794d180c7ee15245c69c1

okhat commented 1 month ago

This is still not fixed on the latest LiteLLM, @krrishdholakia; many users are complaining.

model = "openai/meta-llama/Meta-Llama-3-8B"
api_base="http://localhost:7504/v1" # SGLang backend

[screenshot attached]

okhat commented 1 month ago

It's probably coming from here:

https://github.com/BerriAI/litellm/blob/b2fbee3923b76f8142531c46c518c3ff1818dd59/litellm/cost_calculator.py#L627

isaacbmiller commented 1 month ago

+1 to this. This blocked me from doing some of my work today because the outputs on every DSPy call would crash my Jupyter notebook.

I ended up commenting out the part of cost_calculator that raises the error, as a temporary fix.
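A less invasive workaround than editing the installed package is to raise the threshold of LiteLLM's logger so the error-level cost-calculator messages are dropped. This is a sketch under two assumptions: that the logger is registered under the name "LiteLLM", and that the message is logged rather than raised (it does not help if the exception actually propagates):

```python
import logging

# Raise the threshold of LiteLLM's logger: records below CRITICAL (including
# the ERROR-level cost_calculator messages) will no longer be emitted.
# The logger name "LiteLLM" is an assumption about litellm's logging setup.
logging.getLogger("LiteLLM").setLevel(logging.CRITICAL)

# Sanity check: a logger at CRITICAL does not emit ERROR records.
assert not logging.getLogger("LiteLLM").isEnabledFor(logging.ERROR)
```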

krrishdholakia commented 1 month ago

Acknowledging this. Will have this fixed by tomorrow.

krrishdholakia commented 1 month ago

Fix pushed - https://github.com/BerriAI/litellm/commit/cb7a60a8fa3df561f8075088caed959a8f56ce1c

It will be live in tomorrow's release.

okhat commented 1 month ago

Thank you @krrishdholakia !!