Closed babybirdprd closed 9 months ago
Yes, I will consider adding support for more inference engines. However, the current priority is adding support for multimodal models (this will enable some truly amazing features), and I need to get that done first.
For details, see the "Models(or Services) compatible with OpenAI Interface" section in README.md.
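For anyone landing here: services that expose an OpenAI-compatible API can usually be reached by pointing a standard chat-completions request at their local endpoint. A minimal sketch below, assuming Ollama's documented default of `http://localhost:11434/v1` (LM Studio's default is `http://localhost:1234/v1`); the model name `llama3` is just an example and depends on what you have pulled locally.

```python
import json

def chat_request(base_url: str, model: str, prompt: str) -> dict:
    """Build the URL and JSON body for an OpenAI-style chat completion.

    Any OpenAI-compatible server (Ollama, LM Studio, a LiteLLM proxy, ...)
    accepts this same request shape at {base_url}/chat/completions.
    """
    return {
        "url": f"{base_url}/chat/completions",
        "body": {
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        },
    }

# Example: target a local Ollama instance (default port 11434).
req = chat_request("http://localhost:11434/v1", "llama3", "Hello!")
print(json.dumps(req["body"], indent=2))
```

Swapping providers then only means changing `base_url` and `model`; the request body itself stays the same.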
Awesome! Will check it shortly.
Various inference services (Ollama, LiteLLM, LM Studio, etc.) are now well supported, so let's close this issue.