Closed: Jacobh2 closed this issue 4 hours ago
```yaml
api_base: https://.openai.azure.com/
```
It looks like you're using Azure.

This is because the price is calculated based on the model name returned by Azure (since the deployment name can be anything).

You can specify the `base_model` this maps to in order to get the relevant price information: https://docs.litellm.ai/docs/proxy/cost_tracking#spend-tracking-for-azure-openai-models
Yes, and as you can see in the config I provided, I do set the base model. I believe that's why I get the proper cost in the database; without it, I assume the cost in the database would also be wrong or missing.
Please stop closing issues until they are properly resolved @krrishdholakia 🙏 😅
The base model needs to be set in `model_info`:
https://docs.litellm.ai/docs/proxy/cost_tracking#chat-completions--embeddings
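A config along the lines the docs describe, with `base_model` nested under `model_info` (the model alias, deployment name, and base model here are hypothetical placeholders), would look like:

```yaml
model_list:
  - model_name: my-azure-model        # hypothetical alias exposed by the proxy
    litellm_params:
      model: azure/my-deployment      # deployment name can be anything
      api_base: https://.openai.azure.com/
      api_key: os.environ/AZURE_API_KEY
    model_info:
      base_model: azure/gpt-4o        # maps the deployment to a known model for pricing
```

The key point is that `base_model` belongs under `model_info`, not under `litellm_params`, so the proxy can resolve the deployment to a priced model.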
Description
Calling the `/v1/model/info` endpoint does not include prices for models, even though the price is correctly recorded in the database. I have the following config:
and make a call
I can correctly see that the call is made and the spend is recorded in the database:
I can also see in the logs the following:
But if I make a request to the model info API endpoint, I get the following data:
where the costs are set to `null`.

Expected
I would expect to get back the correct cost for this model!
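A small helper (a sketch; the payload shape follows a `/v1/model/info`-style response with `data` entries carrying `model_info`, and the sample values are hypothetical) can flag models whose cost fields come back null:

```python
import json


def models_missing_costs(payload: dict) -> list:
    """Return model names whose per-token cost fields are null/absent
    in a /v1/model/info-style payload."""
    missing = []
    for entry in payload.get("data", []):
        info = entry.get("model_info", {})
        if info.get("input_cost_per_token") is None or info.get("output_cost_per_token") is None:
            missing.append(entry.get("model_name", "<unnamed>"))
    return missing


# Hypothetical payload shaped like the response in this issue: costs are null.
payload = json.loads("""
{
  "data": [
    {
      "model_name": "my-azure-model",
      "model_info": {
        "base_model": "azure/gpt-4",
        "input_cost_per_token": null,
        "output_cost_per_token": null
      }
    }
  ]
}
""")

# A non-empty list means pricing was not resolved for those models.
print(models_missing_costs(payload))
```

Running this against the response above prints `['my-azure-model']`, confirming the cost fields are missing even though spend tracking works.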
Relevant log output
No response
Twitter / LinkedIn details
No response