longy2k / obsidian-bmo-chatbot

Generate and brainstorm ideas while creating your notes using Large Language Models (LLMs) from Ollama, LM Studio, Anthropic, Google Gemini, Mistral AI, OpenAI, and more for Obsidian.

BMO gets http 400 #105

Frill7742 opened this issue 3 months ago

Frill7742 commented 3 months ago

Hi,

Before I start, a huge thanks for the project.

Right now I am experiencing some problems with BMO. When I start my laptop and try to run a BMO chat, I get an HTTP 400 error. I can chat with any model on Ollama normally over the terminal and with the Open WebUI interface (hosted through a local Docker container with host networking). Sometimes it works, and then after a restart I get the 400 again.


Log/Screenshots:

(screenshots of the error attached)


Config:

Obsidian BMO:

(screenshot attached)

Open-WebUI (to show that this works):

(screenshot attached)

Ollama service file

(screenshot attached)


I hope somebody can help me solve this issue, since I have integrated BMO deeply into my learning workflow and it would be very nice to keep working with this flow. If any further logs or information are needed, please let me know.

FairyTail2000 commented 1 week ago

I think I found the issue: local images. I have attached a zip which reproduces the bug: bug.zip

The JSON sent by BMO looks like this:

{
    "keep_alive": null,
    "messages": [{
            "content": "You are a helpful assistant.\n\n[truncated current note ref]",
            "role": "system"
        }, {
            "content": "So what additional features are in this firmware?",
            "images": [
            "<censored>\\Obsidian\\Default\\Clippings\\bug\\53310afe59ca6a13ef0dac383de25d2a_MD5.png",
            "<censored>\\Obsidian\\Default\\Clippings\\bug\\9057e08be45e87d6d58eb80aaec2f5a3_MD5 1.png",
            "<censored>\\Obsidian\\Default\\Clippings\\bug\\abbfbfcc74200b978f40b1ef942d5a84_MD5.png",
            "<censored>\\Obsidian\\Default\\Clippings\\bug\\61d0213862a30b4f3f2b53b3592e94f4_MD5 1.png",
            "<censored>\\Obsidian\\Default\\Clippings\\bug\\560b7d3aa85e1f725f0fce768b57e9de_MD5.png",
            "<censored>\\Obsidian\\Default\\Clippings\\bug\\565d3cb918b8016140ae898e7a51ccd2_MD5.png"],
            "role": "user"
        }
    ],
    "model": "llama3.2:latest",
    "options": {
        "min_p": 0,
        "mirostat": 0,
        "mirostat_eta": 0.1,
        "mirostat_tau": 5,
        "num_ctx": 8096,
        "num_gqa": null,
        "num_predict": -1,
        "num_thread": null,
        "repeat_last_n": 64,
        "repeat_penalty": 1.1,
        "seed": null,
        "stop": null,
        "temperature": 1,
        "tfs_z": 1,
        "top_k": 40,
        "top_p": 0.9
    },
    "stream": true
}

The response from Ollama looks like this:

{"error":"illegal base64 data at input byte 1"}

I think Ollama expects a base64-encoded representation of the images, but llama3.2:latest is not capable of processing images, so sending them here does not really help. This could also be raised as an issue with Ollama: since the model is not capable of vision, the images element should simply be ignored.

BMO version: 2.3.3, Obsidian: 1.7.7, Ollama: 0.4.2, Windows 11

philip-iii commented 1 week ago

I get a similar problem, e.g. with Groq:

{
    "error": {
        "message": "'messages.1' : for 'role:user' the following must be satisfied[('messages.1' : property 'images' is unsupported, did you mean 'name'?)]",
        "type": "invalid_request_error"
    }
}

The images property is always included, even when it is an empty array (which it mostly is for messages coming from the message history), e.g.:

(screenshot attached)

Ollama and OpenRouter accept the images property, but others do not. If images are to be supported, this should be an optional toggle, and it should not interfere with models/APIs that do not support it. As far as I can tell, even the models/APIs that do support it handle images in different ways.

philip-iii commented 1 week ago

Indeed, adding the following before the fetch (in both the streaming and non-streaming versions) seems to fix the issue:

    // Strip the unsupported `images` property from every message before sending
    messageHistoryAtIndex.forEach(function (v) { delete v.images; });