Thanks for raising this.
I cannot reproduce this in my environment, so do you mind opening the devtools (Ctrl+Shift+i) and taking a screenshot of whatever error message you see in the "Console" tab?
This is just a guess, but the 404 seems to suggest that the model is not available.
Make sure the server is running and that you have pulled the model in the terminal:
$ ollama serve
$ ollama pull llama2
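If it still 404s after pulling, it's worth double-checking which models the server actually has; a quick sanity check (assuming the default port 11434):
$ ollama list
$ curl http://localhost:11434/api/tags
Both should list the model exactly as the plugin requests it, tag included.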
The models I installed are not the standard versions; could this be the problem?
That's likely the case. Do you mind letting me know which model and version you're using, so I can have a look?
Makes sense. Since the current version of the plugin does not specify the tag, it seems to fall back to latest.
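For context, here is roughly what that fallback means on the Ollama side: a request that names a model without a tag is resolved as name:latest, so if you only pulled, say, llama2:13b, the untagged request 404s. The request body below is illustrative, not the plugin's exact payload:
$ curl http://localhost:11434/api/generate -d '{"model": "llama2", "prompt": "hello"}'
Ollama treats "llama2" here as "llama2:latest".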
I've just pushed 48c9f28, which adds a setting to specify the model tag for Ollama. I'll release a new version after a bit more testing; hopefully that fixes it!
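Once that lands, the idea is that you pull the tagged model and enter the same tag in the new setting; the tag below is just an example:
$ ollama pull llama2:13b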
Expected Behavior
Actual Behavior
I used http://192.168.101.19:11434, but the connection failed.
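A hedged guess, assuming the server runs on a different machine than the client: Ollama binds to 127.0.0.1 by default, so a LAN address like 192.168.101.19 is only reachable if the server was started listening on all interfaces:
$ OLLAMA_HOST=0.0.0.0 ollama serve
$ curl http://192.168.101.19:11434
The curl should answer with "Ollama is running" once the server is reachable.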
Steps to Reproduce the Problem
Specifications