DrewThomasson opened this issue 8 months ago
Yes, all the LLM interface code is in the ChatAI submodule. It is a planned feature, although lower priority than fixing clozes and other bugs.
+1 I think it's possible since the project uses langchain. Integrating it with Ollama would be appreciated too!
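For context, even before a langchain-based integration lands, Ollama exposes a local HTTP API (default `http://localhost:11434/api/generate`), so a minimal client is just a JSON POST with the standard library. This is only a sketch; the model name and helper names here are illustrative, not part of the project:

```python
import json
import urllib.request

# Ollama's default local generate endpoint
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(model: str, prompt: str) -> dict:
    # Minimal request body for Ollama's /api/generate endpoint;
    # stream=False asks for one complete JSON response instead of chunks.
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model: str, prompt: str) -> str:
    # POST the payload and return the generated text from the "response" field.
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_payload(model, prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

A langchain wrapper around Ollama would replace the manual request code, but the underlying call is the same.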
Here's my PR for adding Ollama support
Just wondering, or if you could point me to the code files that deal with the OpenAI API, that works too lol