Closed: nextzard closed this issue 4 months ago.
I'm using Fedora 40 and installed LiteLLM today.
I am using the Ollama model codegemma, but after it finishes answering I get this message:
22:42:02 - LiteLLM:ERROR: litellm_logging.py:1266 - Model=codegemma not found in completion cost map. Setting 'response_cost' to None
If I use it with llama3, there is no error.
Maybe it's related to https://github.com/BerriAI/litellm/blob/main/litellm/model_prices_and_context_window_backup.json; I wonder if the error occurs when the model is not in that JSON file. Otherwise it's working well, thanks.
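If it helps with triage, here is a minimal reproduction sketch, plus a way to check the cost map from Python. Assumptions: a local Ollama server on the default port with codegemma already pulled, and that `litellm.model_cost` (the dict LiteLLM loads from that backup JSON) is what the cost lookup consults:

```python
import litellm

# litellm.model_cost is the in-memory copy of
# model_prices_and_context_window_backup.json; a model with no entry
# here is what triggers the "not found in completion cost map" error.
print("ollama/codegemma" in litellm.model_cost)  # expected False at the time of this issue
print("ollama/llama3" in litellm.model_cost)     # llama3 has an entry, hence no error

# Reproduction: a successful completion against the unknown model
# logs the ERROR line after the response is returned.
response = litellm.completion(
    model="ollama/codegemma",
    messages=[{"role": "user", "content": "Say hi"}],
    api_base="http://localhost:11434",  # default Ollama endpoint
)
print(response.choices[0].message.content)
```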
Fixed here: https://github.com/BerriAI/litellm/pull/4424
@nextzard curious - what do you use LiteLLM for today?
I am having this issue with phi3 and llava-phi3 using Ollama; a possible workaround sketch is below.
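Until the fix above ships in a release, one possible workaround is to register a zero-cost entry for the local models yourself. `litellm.register_model` is LiteLLM's documented hook for custom pricing; the key names below follow the schema of the backup JSON, and the zero prices are just placeholders for models served locally:

```python
import litellm

# Workaround sketch: give local Ollama models zero-cost entries in the
# cost map so response_cost can be computed instead of logging an error.
litellm.register_model({
    "ollama/phi3": {
        "input_cost_per_token": 0.0,
        "output_cost_per_token": 0.0,
        "litellm_provider": "ollama",
        "mode": "chat",
    },
    "ollama/llava-phi3": {
        "input_cost_per_token": 0.0,
        "output_cost_per_token": 0.0,
        "litellm_provider": "ollama",
        "mode": "chat",
    },
})
```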