longy2k / obsidian-bmo-chatbot

Generate and brainstorm ideas while creating your notes using Large Language Models (LLMs) from Ollama, LM Studio, Anthropic, OpenAI, Mistral AI, and more for Obsidian.
https://ko-fi.com/longy2k
MIT License

Messages end up in wrong order, sometimes disappear, "delete" deleting the wrong one #71

Open · Madd0g opened this issue 2 months ago

Madd0g commented 2 months ago

I'm playing locally with llama3, which recently came out; it has this annoying quirk of responding with nothing.

I think in those cases the plugin gets a little confused, because I often get the bot's response above mine. Once that happens, a lot more things break: delete removes the wrong chat pair, and my message disappears while the bot message remains. Really annoying side effects.

maybe it puts the bot message at the wrong index first and "fixes" it somehow upon the first real token?

These empty responses are VERY tolerable because of the new edit feature (it's so great!), but when it puts the response in the wrong order, it breaks a lot of functionality and I couldn't find any workarounds

I found myself waving the cursor over 3 consecutive empty bot messages, looking for the delete button; it took me a while to realize it's only on the user message, which is no longer visible...
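
To illustrate the race I'm imagining (a guess at the mechanism with made-up names, not the plugin's actual code):

```typescript
// Hypothetical sketch: a placeholder bot message is appended before any
// token arrives, and nothing cleans it up when the model streams zero tokens.
type Message = { role: "user" | "assistant"; content: string };

const history: Message[] = [];

async function send(userText: string, tokens: AsyncIterable<string>) {
  history.push({ role: "user", content: userText });

  const bot: Message = { role: "assistant", content: "" };
  history.push(bot); // placeholder appended optimistically

  for await (const token of tokens) {
    bot.content += token; // the first real token "anchors" it in the UI
  }

  // With a zero-token response the empty placeholder survives, so
  // index-based operations (delete, edit) now point at the wrong pair.
}
```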

twalderman commented 2 months ago

I've seen this many times on various models, since 2.x.

longy2k commented 2 months ago

Hey,

Sorry, I have not been able to work on this project for over a month now. I have been busy with school and work.

For this particular issue, I have also noticed that my conversations tend to break the UI at any given moment.

In the meantime, my quick fix is to click on the 'BMO Chatbot' button in the ribbon bar. This will keep your current conversation but it will also rebuild the window.

I will be back working on BMO in a few weeks.

Thanks

longy2k commented 1 month ago

I think one of the issues is that the models' responses are not trimmed.

Another issue is that using commands (e.g. /save, /model, and /append) broke the UI at times.

I resolved both of these issues in v2.1.0.

Let me know if you run into the same issue of messages ending up in the wrong order.
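
For reference, the trimming fix boils down to something like this (an illustrative sketch with made-up names, not the actual plugin source):

```typescript
// Trim model output and drop all-whitespace responses before they
// reach the conversation history.
type Message = { role: "user" | "assistant"; content: string };

function appendBotMessage(history: Message[], raw: string): void {
  const content = raw.trim();
  if (content === "") return; // an empty message would break ordering/delete
  history.push({ role: "assistant", content });
}
```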

Madd0g commented 1 month ago

with the new version I don't see (and am unable to choose) any of the Ollama models

longy2k commented 1 month ago

@Madd0g

Please update to v2.1.1 and make sure you bypass the CORS policy for any Ollama servers connecting to Obsidian. You can follow these instructions: https://github.com/longy2k/obsidian-bmo-chatbot/wiki/How-to-setup-with-Ollama

Let me know if you got it working, thanks!
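
For context, the model dropdown is populated from Ollama's /api/tags endpoint. A sketch of the kind of request involved (not the plugin's exact code): if the CORS policy is not bypassed, this fetch is blocked inside Obsidian and the model list stays empty.

```typescript
// List local Ollama models via the /api/tags endpoint.
async function listOllamaModels(
  baseUrl = "http://localhost:11434"
): Promise<string[]> {
  const res = await fetch(`${baseUrl}/api/tags`);
  if (!res.ok) throw new Error(`Ollama responded ${res.status}`);
  const data = (await res.json()) as { models: { name: string }[] };
  return data.models.map((m) => m.name);
}
```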

Madd0g commented 1 month ago

that's weird. I see the models now, but I'm getting this: Error fetching chat response from Ollama: ResponseError: invalid options: min_p

twalderman commented 1 month ago

min_p is not yet supported by Ollama; it will be soon. It is supported by llama.cpp. Leave the field empty and ignore any errors.

Madd0g commented 1 month ago

Leave the field empty and ignore any errors.

even touching the field sets a 0.0 value

when I deleted it from the settings completely, the plugin still sends it and still errors

[screenshot of the failing request]
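
What I'd expect is for unset options to be omitted from the request entirely, something like this (a hypothetical sketch, not the plugin's code; the option names are just examples):

```typescript
// Only include sampling options the user actually set, so an
// unsupported key like min_p never reaches Ollama's /api/chat endpoint
// (which rejects unknown options with a 400, as seen above).
interface SamplingOptions {
  temperature?: number;
  min_p?: number; // not supported by Ollama at the time of this thread
}

function buildChatBody(
  model: string,
  messages: object[],
  opts: SamplingOptions
) {
  const options: Record<string, number> = {};
  for (const [key, value] of Object.entries(opts)) {
    if (typeof value === "number") options[key] = value; // skip unset keys
  }
  return { model, messages, options };
}
```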

twalderman commented 1 month ago

It should not affect responses though

Madd0g commented 1 month ago

It should not affect responses though

I don't get any response in the UI - I see the request failing with status 400

twalderman commented 1 month ago

Please try making a fresh vault and see if it is working as expected. Also try the default port 11434 first to verify.
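
Something like this quick check (a sketch, assuming a stock Ollama install on the default port) will tell you whether the server is reachable at all:

```typescript
// Ollama's root endpoint returns the plain-text "Ollama is running"
// when the server is up on the default port 11434.
async function checkOllama(
  baseUrl = "http://localhost:11434"
): Promise<boolean> {
  try {
    const res = await fetch(baseUrl);
    return res.ok;
  } catch {
    return false; // connection refused: server not running or wrong port
  }
}
```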

longy2k commented 1 month ago

@Madd0g

Try updating Ollama to the latest version.

Another user faced the same issue (https://github.com/longy2k/obsidian-bmo-chatbot/issues/62#issuecomment-2132427088)

Madd0g commented 1 month ago

I wish there was a command to update via the CLI. I never log in physically to that computer, which is why I'm always behind.

I updated ollama and it works now. Thank you, happy to have it working again!