chhoumann / quickadd

QuickAdd for Obsidian
https://quickadd.obsidian.guide
MIT License
1.48k stars, 135 forks

[FEATURE REQUEST] AI assistant with local LLM #623

Open aparente opened 7 months ago

aparente commented 7 months ago

Is your feature request related to a problem? Please describe. I'd like the option to run the AI assistant with a local LLM, e.g. via Ollama. I know I could probably do this through an arbitrary script, but it would be nice to have a local model as an option in the AI assistant itself.

Describe the solution you'd like Specify a path to a local model (or endpoint) while keeping system prompts and macro recipes the same.

Additional context This isn't a fully fleshed-out request. There are probably ways to do this through local scripts; I'll have to dig through the documentation first (though it might make a nice documentation example!). Having an option to specify a local model in the AI assistant macros would be really useful.
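For context, here is a minimal sketch of what the local option could look like, assuming Ollama's OpenAI-compatible chat endpoint (`http://localhost:11434/v1/chat/completions`). The function names, the `llama3` model tag, and the system prompt are hypothetical, not QuickAdd's actual API; the point is that the message payload (system prompt plus macro prompt) stays identical to the hosted setup, and only the base URL and model name change:

```typescript
// Hypothetical sketch: routing a QuickAdd-style prompt to a local Ollama
// server instead of OpenAI. Only the endpoint and model name differ;
// system prompt and macro recipe content are unchanged.

interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Build an OpenAI-style chat request body. This shape is what Ollama's
// OpenAI-compatible endpoint accepts, so prompts carry over as-is.
function buildChatRequest(
  model: string,
  systemPrompt: string,
  userPrompt: string
): { model: string; messages: ChatMessage[]; stream: boolean } {
  const messages: ChatMessage[] = [
    { role: "system", content: systemPrompt },
    { role: "user", content: userPrompt },
  ];
  return { model, messages, stream: false };
}

// Send the request to a locally running Ollama instance (default port 11434).
async function runLocalCompletion(prompt: string): Promise<string> {
  const body = buildChatRequest(
    "llama3", // hypothetical local model tag
    "You are a note-taking assistant.", // placeholder system prompt
    prompt
  );
  const res = await fetch("http://localhost:11434/v1/chat/completions", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(body),
  });
  const json = await res.json();
  return json.choices[0].message.content;
}
```

Since the request body matches the OpenAI chat format, an implementation could plausibly reuse the existing AI assistant code path and just swap the base URL and model name when a local provider is selected.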