manimohans / obsidian-local-llm-helper

MIT License

Memory on the same Obsidian file #1

Closed joseiriarte1982 closed 4 days ago

joseiriarte1982 commented 1 month ago

I have a local LLM, or more precisely, I am working with llama.cpp. The thing is, I open the file where I will be making my prompts, but since I am working with the server, each prompt is treated as if it were the first one. Wouldn't it be nice if you could store the conversation for the current file in localStorage or a similar DB? On top of that, you could prefix each new response line with the name of the model, or whatever the LLM is configured to be called. I don't know much about how Obsidian works, I am a web developer, but I was checking main.js and I think the code is easy to understand. So maybe I could help with your guidance. Regards
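For reference, a minimal sketch of what this comment describes, assuming localStorage as the store and the note's vault path as the key; the names (`ChatTurn`, `loadHistory`, `saveHistory`, the key prefix) are hypothetical, not the plugin's actual code:

```ts
// Hypothetical sketch: persist per-file chat history in localStorage,
// keyed by the note's vault path. All names here are illustrative
// assumptions, not the shipped plugin code.
interface ChatTurn {
  role: "user" | "assistant";
  content: string;
}

const STORAGE_PREFIX = "llm-helper-history:"; // assumed key namespace

function loadHistory(filePath: string): ChatTurn[] {
  const raw = window.localStorage.getItem(STORAGE_PREFIX + filePath);
  return raw ? (JSON.parse(raw) as ChatTurn[]) : [];
}

function saveHistory(filePath: string, history: ChatTurn[]): void {
  window.localStorage.setItem(STORAGE_PREFIX + filePath, JSON.stringify(history));
}
```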

manimohans commented 1 month ago

Good points. The problem with working with local LLMs is that the context window is pretty small, and if we keep feeding the full conversation history as context, the response quality might take a hit. I can add something that stores only the last N exchanges, for a small N, say N = 3. The other point, about writing model info, is doable, but we already have that information available from the settings page.
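A rolling window like the one the maintainer describes could look roughly like this; `appendTurn`, `buildPrompt`, and the plain-text prompt format are assumptions for illustration, not the fix that actually shipped:

```ts
// Hypothetical sketch of the last-N idea: trim stored history to the
// most recent N user/assistant exchanges before using it as context.
type ChatTurn = { role: "user" | "assistant"; content: string };

const MAX_EXCHANGES = 3; // N = 3, as suggested above

function appendTurn(history: ChatTurn[], turn: ChatTurn): ChatTurn[] {
  // Each exchange is two messages, so keep the last N * 2 entries.
  return [...history, turn].slice(-MAX_EXCHANGES * 2);
}

function buildPrompt(history: ChatTurn[], newPrompt: string): string {
  // Prepend the trimmed history so the llama.cpp server sees recent
  // context without overrunning a small context window.
  const context = history.map((t) => `${t.role}: ${t.content}`).join("\n");
  return context ? `${context}\nuser: ${newPrompt}` : `user: ${newPrompt}`;
}
```

Capping by exchange count rather than token count keeps the sketch simple; a token-budget trim would be the more precise variant for very small context windows.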

manimohans commented 4 days ago

v1.1.3 should address this.