Closed perfectra1n closed 6 months ago
Let me know if you're okay with this @soulsands. If I don't hear from you for a while, I'll just go ahead and merge this and create a new release for it :)
@Lolabird do you have any issues with me merging this? :)
This PR adds functionality and documentation for using Ollama instead of OpenAI's API. The user will of course need their own Ollama instance, but this allows them to keep all their LLM traffic local, and even to use their own custom-built LLMs.
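
For anyone wanting to try this locally, here's a minimal sketch of the general idea (not the exact code in this PR): Ollama exposes an OpenAI-compatible endpoint at `http://localhost:11434/v1` by default, so an existing OpenAI client can simply be pointed at the local instance. The model name below (`llama3`) is just an example of a locally pulled model; swap in whatever you've pulled with `ollama pull`.

```python
# Minimal sketch: pointing the official OpenAI Python client at a local
# Ollama instance instead of api.openai.com. The base URL is Ollama's
# default OpenAI-compatible endpoint; the model name is an example of a
# locally available model (e.g. after `ollama pull llama3`).
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # local Ollama, not OpenAI
    api_key="ollama",  # Ollama ignores the key, but the client requires one
)

response = client.chat.completions.create(
    model="llama3",  # any model available in the local Ollama instance
    messages=[{"role": "user", "content": "Summarize this note in one sentence."}],
)
print(response.choices[0].message.content)
```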