BerriAI / litellm

Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

[Bug]: /spend/calculate doesn't work for Gemini models #4452

Closed · emerzon closed this issue 3 months ago

emerzon commented 3 months ago

What happened?

Querying /spend/calculate for Gemini models always returns a cost of zero.

Relevant log output

curl --location 'https://llm/spend/calculate' \
  --header 'Content-Type: application/json' \
  --data '{"model": "gemini-1.5-pro", "messages": [{"role": "user", "content": "Hey, hows it going???"}]}'

{"cost":0.0}

Twitter / LinkedIn details

@emersongomesma

krrishdholakia commented 3 months ago

able to repro