taichimaeda / markpilot

AI-powered inline completions and chat view for Obsidian
MIT License

[Bug] Unable to connect to ollama. #6

Closed: wwjCMP closed this issue 2 weeks ago

wwjCMP commented 2 months ago

Actual Behavior

I used http://192.168.101.19:11434, but the connection failed.

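One quick way to narrow this down, independently of the plugin, is to probe the server's base URL: a running Ollama instance answers GET / with a plain-text health message. A minimal TypeScript sketch, using the address from the report:

// Probe the Ollama server. A running instance responds to GET / with the
// plain text "Ollama is running". A network error here points at
// connectivity instead (note that Ollama binds to 127.0.0.1 by default;
// set OLLAMA_HOST=0.0.0.0 to expose it on a LAN address like this one).
async function pingOllama(baseUrl: string): Promise<void> {
  try {
    const res = await fetch(baseUrl);
    console.log(res.status, await res.text()); // expect: 200 "Ollama is running"
  } catch (err) {
    console.error(`Cannot reach ${baseUrl}:`, err);
  }
}

pingOllama("http://192.168.101.19:11434");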

taichimaeda commented 2 months ago

Thanks for raising this.

I cannot reproduce this in my environment, so would you mind opening the devtools (Ctrl+Shift+I) and taking a screenshot of whatever error message you see in the "Console" tab?

wwjCMP commented 2 months ago

[screenshot: console showing a 404 error from the Ollama endpoint]

taichimaeda commented 1 month ago

This is just my guess, but the 404 seems to suggest that the model is not available.

Make sure you have pulled the model in the terminal:

$ ollama pull llama2
$ ollama serve
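
For reference, here is a rough sketch of how that 404 can arise, assuming the plugin posts to Ollama's /api/generate endpoint (the plugin's actual request code may differ). Ollama answers 404 when the requested model has not been pulled:

// Sketch of a completion request against Ollama's REST API (TypeScript,
// Node 18+ for the global fetch). If the model has not been pulled,
// Ollama responds 404 with a body such as
// {"error": "model 'llama2' not found, try pulling it first"}.
async function generate(baseUrl: string, model: string, prompt: string): Promise<string> {
  const res = await fetch(`${baseUrl}/api/generate`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model, prompt, stream: false }),
  });
  if (!res.ok) {
    throw new Error(`Ollama returned ${res.status}: ${await res.text()}`);
  }
  const data = await res.json();
  return data.response; // /api/generate puts the completion in "response"
}

generate("http://192.168.101.19:11434", "llama2", "Hello").then(console.log, console.error);
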
wwjCMP commented 1 month ago

The models I installed are not the standard versions. Could this be the problem?

taichimaeda commented 1 month ago

That's likely the case. Do you mind letting me know the model and the version you're using, so I can have a look?

wwjCMP commented 1 month ago

[screenshot: installed models, pulled with non-default tags]

taichimaeda commented 1 month ago

Makes sense. Since the current version of the plugin does not specify the tag, it seems to fall back to latest.

I've just pushed 48c9f28, which adds a setting to specify the model tag for Ollama. I'll release a new version after a bit more testing; hopefully this fixes it!
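
For anyone following along, here is a minimal sketch of what such a setting amounts to (the names below are illustrative assumptions, not the actual code in 48c9f28): the model name sent to Ollama becomes name:tag, and a bare name is what made Ollama resolve to latest.

// Illustrative sketch: an optional tag setting appended to the model name.
// A bare name like "llama2" is resolved by Ollama as "llama2:latest",
// which 404s when only a differently tagged variant has been pulled.
interface OllamaSettings {
  model: string;     // e.g. "llama2"
  modelTag?: string; // e.g. "13b"; leave unset for Ollama's default tag
}

function resolveModel({ model, modelTag }: OllamaSettings): string {
  return modelTag ? `${model}:${modelTag}` : model;
}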