Closed · K-J-VV closed this issue 6 months ago
Looks like the documentation on LiteLLM may need updating? I was able to get help over at LibreChat; the answer can be found here: https://github.com/danny-avila/LibreChat/discussions/2215
@K-J-VV could you detail the change you applied as I am facing the same issue?
I found this change, but I'm facing the problem with every model (llama3, mistral).
Just realized my comment above had an incorrect hyperlink to the solution; I've updated it. Hope it helps!
What happened?
I am trying to use LiteLLM to proxy Ollama for LibreChat. However, when I ask LiteLLM anything, it responds, but the text comes out as garbled nonsense. When I ask Ollama directly, the response is great. Below are screenshots.
Response via LiteLLM, proxying Ollama
Response via Ollama, directly
Here is the content of my /app/config.yaml for LiteLLM:
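(The config contents weren't captured in this copy of the thread. For context, a minimal LiteLLM proxy config routing to an Ollama model typically looks like the sketch below; the model name and `api_base` are illustrative assumptions, not the reporter's actual values.)

```yaml
model_list:
  - model_name: llama3                  # name exposed to clients (illustrative)
    litellm_params:
      model: ollama/llama3              # "ollama/" prefix routes the request to Ollama
      api_base: http://localhost:11434  # default Ollama address (assumption)
```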
Here is the content of my /app/librechat.yaml file
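(This file's contents were also lost in the copy. A typical `librechat.yaml` custom-endpoint entry pointing LibreChat at a LiteLLM proxy looks roughly like the sketch below; the endpoint name, key, base URL, and model list are illustrative assumptions.)

```yaml
version: 1.0.5
cache: true
endpoints:
  custom:
    - name: "LiteLLM"                    # display name in LibreChat (illustrative)
      apiKey: "sk-1234"                  # should match the LiteLLM proxy key (assumption)
      baseURL: "http://litellm:4000/v1"  # LiteLLM's OpenAI-compatible endpoint (assumption)
      models:
        default: ["llama3"]
        fetch: true                      # let LibreChat fetch the model list from LiteLLM
```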
Relevant log output