Open g588928812 opened 1 year ago
I would love to take the time to add support for multiple backends, I think it would really help with self-hosting. :smile:
I don't really have the bandwidth for it right now, but someone else could definitely give it a shot! I think the easiest way would be to write a wrapper server in its own repository that makes calls to whatever backend we want and streams the outputs in a format compatible with ChatUI, similar to this one for using OpenAI APIs with ChatUI.
Then we could document those wrappers in the readme so users could pick and choose their backends!
Let me know if you want to try it and I can help out!
Note that oobabooga/text-generation-webui is a Gradio app, so compatibility with it would mean compatibility with any Gradio app used as an API – which would be awesome! cc @abidlabs
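For reference, any Gradio app can be called programmatically with the `gradio_client` package. A rough sketch, assuming `gradio_client` is installed; the endpoint name `"/chat"` is hypothetical and varies per app (each Gradio app's "Use via API" page lists the real endpoint names and signatures):

```python
def ask_gradio_backend(app_url: str, prompt: str) -> str:
    """Call a running Gradio app's API endpoint and return its text output."""
    # import deferred so this sketch reads standalone; needs `pip install gradio_client`
    from gradio_client import Client

    client = Client(app_url)
    # "/chat" is an assumed endpoint name for illustration only
    return client.predict(prompt, api_name="/chat")
```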
IMO @nsarrazin if we have a clean implementation from the community we can also upstream it into this repo
It should be possible to add support for multiple backends using https://github.com/BerriAI/litellm
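For context, litellm exposes one `completion()` call that routes to many providers behind an OpenAI-style interface. A minimal sketch, assuming `litellm` is installed; the model identifiers and the streaming chunk shape follow the OpenAI schema that litellm mirrors:

```python
def build_messages(prompt: str) -> list[dict]:
    """Chat-style message list shared by every backend litellm routes to."""
    return [{"role": "user", "content": prompt}]


def stream_completion(model: str, prompt: str):
    """Yield text deltas from any litellm-supported backend (model name picks the provider)."""
    # import deferred so this sketch reads standalone; needs `pip install litellm`
    from litellm import completion

    # stream=True yields incremental OpenAI-style chunks, which a wrapper
    # could then re-encode into whatever event format ChatUI expects
    for chunk in completion(model=model, messages=build_messages(prompt), stream=True):
        delta = chunk.choices[0].delta.content
        if delta:
            yield delta
```

Swapping backends would then be a matter of changing the `model` string (e.g. an OpenAI model name versus a locally served one), with no change to the streaming code.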
Great interface! Are there any plans to support text-generation-webui as a backend? https://github.com/oobabooga/text-generation-webui