hardliner66 opened 2 months ago
I agree. I run my own AI inference servers (several LLMs, two Stable Diffusion instances, a TTS with SR, a voice clone, and a music generator) specifically to avoid sending data into the clutches of Sam Altman and other egomaniacs.
I would like to be able to use my own inference servers.
Even the AI Horde would be preferable to OAI.
It would be nice if we could configure the base URL; then people could use offline models via Ollama or similar tools.
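A configurable base URL would cover most of these cases, since Ollama and many other local servers expose an OpenAI-compatible API (Ollama's lives at `http://localhost:11434/v1` by default). A minimal sketch of what that could look like — the env-var name and helper function here are hypothetical, not an existing option in this project:

```python
import os
from typing import Optional

# Default to the hosted API, but let users point at any
# OpenAI-compatible server (Ollama, LocalAI, AI Horde bridges, ...).
DEFAULT_BASE_URL = "https://api.openai.com/v1"

def chat_completions_url(base_url: Optional[str] = None) -> str:
    """Build the chat-completions endpoint from a configurable base URL.

    Resolution order: explicit argument, then the (hypothetical)
    OPENAI_BASE_URL environment variable, then the hosted default.
    """
    base = base_url or os.environ.get("OPENAI_BASE_URL", DEFAULT_BASE_URL)
    return base.rstrip("/") + "/chat/completions"

# Pointing at a local Ollama server instead of the hosted API:
print(chat_completions_url("http://localhost:11434/v1"))
# → http://localhost:11434/v1/chat/completions
```

The rest of the client code stays unchanged, because the request and response shapes are the same; only the host differs.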