Madd0g opened 2 months ago
I've seen this many times on various models. Since 2.x
Hey,
Sorry I have not been able to work on this project for over a month now. I have been busy with school and work.
For this particular issue, I have also noticed that my conversations tend to break the UI at any given moment.
In the meantime, my quick fix is to click on the 'BMO Chatbot' button in the ribbon bar. This will keep your current conversation but it will also rebuild the window.
I will be back working on BMO in a few weeks.
Thanks
I think one of the issues is that the models' responses are not trimmed.
Another issue is that using commands (e.g. /save, /model, and /append) broke the UI at times.
I resolved both of these issues in v2.1.0.
Let me know if you run into the same issue of messages ending up in the wrong order.
With the new version I don't see (and am unable to choose) any of the Ollama models.
@Madd0g
Please update to v2.1.1 and make sure you bypass the CORS policy for any Ollama servers connecting to Obsidian. You can follow these instructions: https://github.com/longy2k/obsidian-bmo-chatbot/wiki/How-to-setup-with-Ollama
Let me know if you got it working, thanks!
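For reference, the CORS setup in the linked wiki comes down to allowing Obsidian's app origin via Ollama's `OLLAMA_ORIGINS` environment variable before starting the server. A sketch (the exact origin pattern may differ; check the wiki for your platform):

```shell
# Allow cross-origin requests from Obsidian's app origin, then start Ollama.
# "app://obsidian.md*" is Obsidian's origin; adjust if the wiki specifies otherwise.
OLLAMA_ORIGINS="app://obsidian.md*" ollama serve
```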
That's weird. I see the models now, but I'm getting this: Error fetching chat response from Ollama: ResponseError: invalid options: min_p
min_p is not yet supported by Ollama (it will be soon; it is already supported by llama.cpp). Leave the field empty and ignore any errors.
Even touching the field sets a 0.0 value. When I deleted it from the settings completely, it still sends min_p and still errors.
It should not affect responses though
I don't get any response in the UI - I see the request failing with status 400
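As a side note, one way a client could avoid this class of 400 error is to strip unset optional sampling parameters before sending the request. This is a hypothetical sketch, not BMO's actual code; `sanitizeOptions` is an assumed helper name:

```typescript
// Hypothetical helper: drop optional sampling parameters that are unset
// (undefined, null, or an empty string from a blank settings field), so
// servers that don't recognize them (e.g. older Ollama with min_p)
// don't reject the request with "invalid options".
type SamplingOptions = Record<string, number | string | undefined | null>;

function sanitizeOptions(options: SamplingOptions): Record<string, number | string> {
  const cleaned: Record<string, number | string> = {};
  for (const [key, value] of Object.entries(options)) {
    if (value === undefined || value === null || value === "") continue;
    cleaned[key] = value;
  }
  return cleaned;
}

// A blank min_p field is dropped entirely instead of being sent as 0.0.
const body = sanitizeOptions({ temperature: 0.7, min_p: "", top_p: 0.9 });
```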
Please try making a fresh vault and see if it is working as expected. Also try the default port 11434 first to verify.
@Madd0g
Try updating Ollama to the latest version.
Another user faced the same issue (https://github.com/longy2k/obsidian-bmo-chatbot/issues/62#issuecomment-2132427088)
I wish there were a command to update via the CLI. I never log in physically to that computer, which is why I'm always behind.
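For what it's worth, on Linux re-running the official install script upgrades an existing Ollama install in place, which works fine over SSH (this assumes the standard script-based install):

```shell
# Re-running the official install script updates Ollama to the latest version.
curl -fsSL https://ollama.com/install.sh | sh

# Confirm the installed version afterwards.
ollama --version
```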
I updated ollama and it works now. Thank you, happy to have it working again!
I'm playing locally with Llama 3, which recently came out; it has this annoying quirk of responding with nothing.
I think in those cases the plugin gets a little confused, because I often get the bot's response above mine. Once that happens, a lot more things break: delete removes the wrong chat pair, or my message disappears while the bot's message remains. Really annoying side effects.
Maybe it puts the bot message at the wrong index first and then "fixes" it on the first real token somehow?
These empty responses are VERY tolerable because of the new edit feature (it's so great!), but when the response ends up in the wrong order it breaks a lot of functionality, and I couldn't find any workarounds.
I found myself hovering the cursor over three consecutive empty bot messages looking for the delete button; it took me a while to realize the button only appears on the user message, which is no longer visible...
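The ordering bug described above could in principle be avoided by appending the assistant placeholder immediately after the user message and remembering its index, so streamed tokens always update that exact slot instead of re-inserting at a guessed position. A minimal sketch (hypothetical structure, not the plugin's actual code):

```typescript
// Hypothetical chat-history sketch: reserve the assistant's slot up front
// and stream tokens into it by index, so an empty first response can never
// land above the user's message.
interface ChatMessage {
  role: "user" | "assistant";
  content: string;
}

function appendExchange(history: ChatMessage[], userText: string): number {
  history.push({ role: "user", content: userText });
  history.push({ role: "assistant", content: "" }); // placeholder, filled as tokens arrive
  return history.length - 1; // index of the assistant slot
}

const history: ChatMessage[] = [];
const slot = appendExchange(history, "hello");
history[slot].content += "Hi there!"; // streamed tokens update the reserved slot in place
```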