longy2k / obsidian-bmo-chatbot

Generate and brainstorm ideas while creating your notes using Large Language Models (LLMs) from Ollama, LM Studio, Anthropic, Google Gemini, Mistral AI, OpenAI, and more for Obsidian.
https://ko-fi.com/longy2k
MIT License

Leading space in output is not trimmed #66

Open Propheticus opened 6 months ago

Propheticus commented 6 months ago

As the title says, when executing a piece of text as a prompt, the generated output is pasted in with a leading space.

This is a minor issue for plain text, but if the output is a markdown table, the table does not render correctly.

[Attachment: Obsidian_BMO]

EDIT: After some investigation I found it's actually the LLM inference solution I'm using that starts its responses with a space. Still, it might be a good idea to trim leading spaces, especially when the output is used in titles, which become file names.
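A minimal sketch of the suggested fix. These helper names (`trimResponse`, `toSafeTitle`) are hypothetical, not the plugin's actual code; the idea is simply to strip leading whitespace before inserting model output, and to sanitize more aggressively when the text becomes a note title:

```typescript
// Hypothetical helpers sketching the suggested fix (not the plugin's real API).

// Leading spaces/newlines break markdown tables and headings when pasted.
function trimResponse(output: string): string {
  return output.replace(/^\s+/, "");
}

// File names must not start or end with whitespace; also strip characters
// commonly disallowed in file names (the exact set is an assumption here).
function toSafeTitle(output: string): string {
  return trimResponse(output).replace(/[\\/:*?"<>|]/g, "").trim();
}
```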

Propheticus commented 6 months ago

Just tested the chat, and its responses also contain the leading space. It's not limited to "prompt select generate".

Propheticus commented 6 months ago

This also seems to be causing https://github.com/longy2k/obsidian-bmo-chatbot/issues/67

Propheticus commented 6 months ago

Caused by https://github.com/janhq/jan/issues/2548

longy2k commented 4 months ago

v2.1.0

The editor and chat responses are now trimmed.
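One subtlety worth noting when responses arrive as a stream: only the leading whitespace of the overall message should be trimmed, not the start of every chunk (which would eat legitimate spaces between words). A sketch, assuming a chunk-by-chunk streaming shape rather than the plugin's actual implementation:

```typescript
// Sketch (assumed streaming shape, not the plugin's real code): returns a
// function that trims leading whitespace from the first non-empty chunk
// only, leaving later chunks untouched.
function makeStreamTrimmer(): (chunk: string) => string {
  let seenText = false;
  return (chunk: string): string => {
    if (!seenText) {
      chunk = chunk.replace(/^\s+/, "");
      if (chunk.length > 0) seenText = true; // real content has started
    }
    return chunk;
  };
}
```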

Let me know if you still run into this issue, thanks!