Closed: iplayfast closed this issue 7 months ago
Found the secret sauce: more config.yaml settings.
```yaml
llm_api_key: no need
llm_base_url: http://localhost:11434
llm_custom_provider: ollama
llm_model: mixtral
```
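For what it's worth, my understanding is that these fields end up as roughly the following litellm call (just a sketch; the exact parameters rawdog forwards may differ):

```python
import litellm

# Route "mixtral" through the ollama provider at a local base URL,
# bypassing litellm's built-in model list.
# (Sketch only; rawdog may wire these config keys up differently.)
response = litellm.completion(
    model="mixtral",
    custom_llm_provider="ollama",
    api_base="http://localhost:11434",
    messages=[{"role": "user", "content": "print hello world"}],
)
print(response.choices[0].message.content)
```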
Please update the readme to show how to use models that litellm doesn't know about.
I think the config documentation could be better too. It seems the maintainers are open to useful PRs, so feel free to create one.
I updated the readme. Thanks for reporting!
A strange one. I'm using ollama as a local model. If config.yaml is:
rawdog works perfectly. But when I update config.yaml to:
so that it uses the mixtral model, rawdog always gives back an error.
This error seems to originate from litellm's model list at https://github.com/BerriAI/litellm/blob/main/model_prices_and_context_window.json, where mixtral indeed isn't listed (mistral is).
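If it helps, you can inspect that list from Python directly (litellm.model_cost is loaded from that JSON file, as far as I can tell):

```python
import litellm

# List the mistral/mixtral entries that litellm's price/context map
# actually contains; a model missing from it seems to trigger the error above.
known = sorted(
    name for name in litellm.model_cost
    if "mistral" in name or "mixtral" in name
)
print(known)
```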
Is it possible to get around this?