Describe the bug
MC would not connect to Ollama's localhost API endpoint, so I started LiteLLM as an intermediary proxy between Ollama and MC. Text streams come through fine, but nearly every response contains extra formatting characters. How can these be parsed out?
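In case it helps narrow things down, here is a minimal workaround sketch. It assumes the "extra formatting characters" are markdown markers (bold/italic asterisks, backticks, heading hashes) that Gemma2 tends to emit; the function name and regexes are my own, not part of MC or LiteLLM:

```python
import re

def strip_markdown(text: str) -> str:
    """Strip common markdown markers from a model response.
    Assumes the stray characters are **bold**/*italic* asterisks,
    backticks, and leading # headers -- adjust if the actual
    artifacts differ."""
    text = re.sub(r"\*{1,2}(.*?)\*{1,2}", r"\1", text)  # **bold** / *italic*
    text = re.sub(r"`{1,3}", "", text)                   # backticks / code fences
    text = re.sub(r"^#{1,6}\s*", "", text, flags=re.M)   # heading hashes
    return text

print(strip_markdown("**Hello!** Here is `code` and *emphasis*."))
# Hello! Here is code and emphasis.
```

With streaming, this would need to run on the assembled text rather than per-chunk, since a marker pair can be split across chunks.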
To Reproduce
I am using Ollama with the Gemma2 9B model. LiteLLM acts as the proxy, exposing an OpenAI-formatted endpoint.
Screenshots/GIF
See above.
Desktop (please complete the following information):
- OS: Windows 11
- RAM: 96 GB DDR5
- GPU: RTX 4090
- CPU: Intel Core i9-13900K