Closed · louis030195 closed this issue 1 month ago
Atm we only support OpenAI or Ollama.

Add support for a custom OpenAI base URL.

Definition of done:
You can just add a "BASE_URL" in the settings and use litellm. This will enable you to run any model: remote, local, or both. @louis030195
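A minimal sketch of what this could look like, assuming the official `openai` npm client (which accepts a custom `baseURL`) and a litellm proxy listening on its default port; the `BASE_URL` environment variable name and the model names are assumptions, not the actual settings keys:

```ts
import OpenAI from "openai";

// BASE_URL would come from the app settings (hypothetical here);
// when unset, fall back to the official OpenAI endpoint.
const baseURL = process.env.BASE_URL ?? "https://api.openai.com/v1";

const client = new OpenAI({
  baseURL, // e.g. "http://localhost:4000" for a local litellm proxy
  apiKey: process.env.OPENAI_API_KEY ?? "sk-placeholder", // litellm proxies can accept any key
});

async function main() {
  const completion = await client.chat.completions.create({
    // With litellm in front, this can be any model it routes to
    // (e.g. "ollama/llama3" locally or "gpt-4o" remotely).
    model: "gpt-4o",
    messages: [{ role: "user", content: "Hello" }],
  });
  console.log(completion.choices[0].message.content);
}

main();
```

Because litellm exposes an OpenAI-compatible API, swapping only the base URL is enough to route requests to remote, local, or mixed backends without touching the rest of the client code.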
done