RosettaTechnologies / AnkiBrain


Any planned integration for gpt4all api support if someone wants to use their own local LLM model? #3

Open DrewThomasson opened 8 months ago

DrewThomasson commented 8 months ago

Just wondering. Or, if you could point me to the code files to edit that deal with the OpenAI API, that works too lol

eshahrabani commented 8 months ago

Yes, all the LLM interface code is in the ChatAI submodule. It is a planned feature, although lower priority than fixing clozes and other bug fixes.

hamzakat commented 5 months ago

+1 I think it's possible since the project uses langchain. Integrating it with Ollama would be appreciated too!
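(For illustration: since Ollama and GPT4All both expose OpenAI-compatible HTTP endpoints locally, one possible integration path is to keep the existing OpenAI-style request shape and just point it at a local server. The sketch below is hypothetical, not from the AnkiBrain codebase; the helper name and the `llama3` model name are assumptions, and Ollama's default endpoint is `http://localhost:11434/v1`. LangChain's own Ollama wrapper would be another route.)

```python
import json

# Hypothetical helper (not part of AnkiBrain): builds the request body for an
# OpenAI-compatible /v1/chat/completions endpoint. Both Ollama and GPT4All's
# local server accept this shape, so a local model could be swapped in by
# changing only the base URL and model name.
def build_chat_payload(model: str, prompt: str, temperature: float = 0.0) -> dict:
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,
    }

if __name__ == "__main__":
    payload = build_chat_payload("llama3", "Summarize spaced repetition in one sentence.")
    # POSTing this JSON to http://localhost:11434/v1/chat/completions would
    # query a locally running Ollama model (the server must already be running).
    print(json.dumps(payload, indent=2))
```

This keeps the change surface small: the rest of the ChatAI code would see the same response schema it already expects from OpenAI.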

CPritch commented 3 months ago

Here's my PR for adding Ollama support