brumik / obsidian-ollama-chat

A plugin for chatting with your Obsidian notes through a local Ollama LLM instead of ChatGPT.
MIT License

🦙 Obsidian Ollama Chat

This plugin allows you to ask your local LLM about your own notes.

Obsidian plugin page: https://obsidian.md/plugins?id=ollama-chat

Requirements:

Indexing is slow and hard to do in JavaScript, so you will need to run a lightweight Python server next to your Ollama instance to handle the indexing for you.

For more information about progress and installation, see: https://github.com/brumik/ollama-obsidian-indexer
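
Conceptually, the plugin sends your question to the indexer's HTTP API, which looks up relevant note chunks and asks Ollama to answer. The sketch below is only an illustration of that flow; the endpoint path, port, and payload shape (INDEXER_URL and the query field) are assumptions, so consult the indexer repository for the actual API.

```ts
// Hypothetical sketch: the real endpoint and payload are defined by
// ollama-obsidian-indexer; check that repo for the actual API.
const INDEXER_URL = "http://localhost:5000/query"; // assumed host/port

async function askNotes(question: string): Promise<string> {
  const res = await fetch(INDEXER_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ query: question }),
  });
  if (!res.ok) {
    throw new Error(`Indexer returned ${res.status}`);
  }
  // Assumes the indexer replies with the LLM answer grounded in
  // the indexed notes as the response body.
  return await res.text();
}

// Example usage:
askNotes("What did I write about Ollama last week?").then(console.log);
```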

The move-llama-inhouse branch (https://github.com/brumik/obsidian-ollama-chat/tree/move-llama-inhouse) does not need the above server, but it is slower and less developed in general. If you cannot run the Python server, you might still find it useful.

To install that branch, build the plugin yourself with npm run build and copy the output into your vault's .obsidian/plugins folder manually.
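
For context, running without the indexer means talking to Ollama's own HTTP API directly (by default on port 11434) and doing any retrieval work in JavaScript, which is part of why that branch is slower. Below is a minimal sketch of such a direct call; the model name is an assumption and the snippet is not taken from the branch itself.

```ts
// Minimal sketch of a direct call to Ollama's /api/generate endpoint.
// The model name ("llama2") is an assumption; use whichever model you
// have pulled locally. This is not the branch's actual implementation.
const OLLAMA_URL = "http://localhost:11434/api/generate";

async function askOllama(prompt: string): Promise<string> {
  const res = await fetch(OLLAMA_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model: "llama2", prompt, stream: false }),
  });
  if (!res.ok) {
    throw new Error(`Ollama returned ${res.status}`);
  }
  const data = (await res.json()) as { response: string };
  return data.response;
}

// Example usage:
askOllama("Summarise my note on project planning.").then(console.log);
```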

Features:

Future plans:

Any feature recommendation is welcome.