LocalAI (https://github.com/go-skynet/LocalAI/) is a drop-in replacement for the ChatGPT API that lets you serve a wide range of other models so they can be used with ChatGPT-compatible libraries and clients.
This seems like a great approach for adding support for multiple LLM providers.
Any help or head start that would point me in the right direction to work on a PR would be great :)
The OpenAI library lets you override the API base URL via an environment variable in both its Python and Node versions (https://github.com/go-skynet/LocalAI#clients).
It would be great if this could also be made configurable here.
Of course, this implies that no API key would be needed when it's used that way.
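
For reference, here's a minimal sketch of what pointing the OpenAI Python client at LocalAI looks like (assuming the pre-1.0 `openai` package and a LocalAI instance on `localhost:8080`; the port and model name are placeholders for whatever your setup uses):

```python
import openai

# Point the client at a local LocalAI instance instead of api.openai.com.
# Equivalent to setting the OPENAI_API_BASE environment variable.
openai.api_base = "http://localhost:8080/v1"  # assumed LocalAI address/port
openai.api_key = "not-needed"  # LocalAI doesn't require a real key

response = openai.ChatCompletion.create(
    model="ggml-gpt4all-j",  # placeholder: whichever model you've configured in LocalAI
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```

So on this project's side it would mostly come down to exposing the base URL (and making the API key optional) as settings.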