TabbyML / tabby

Self-hosted AI coding assistant
https://tabby.tabbyml.com/

fix(ollama-api-bindings): Fix model availability check #2367

Closed SpeedCrash100 closed 3 months ago

SpeedCrash100 commented 3 months ago

If the model tag is unspecified, the check fails because /api/tags returns the model name with :latest appended, so the comparison mismatches. /api/show, however, handles the untagged name correctly.

Example: specifying llama3 as the chat model failed because availability was checked against 'llama3:latest'.
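The mismatch described above can be sketched as follows. This is a minimal illustration, not the actual PR code; `model_matches` is a hypothetical helper that normalizes a configured model name before comparing it against the names that /api/tags returns:

```rust
// Hypothetical helper illustrating the fix: Ollama's /api/tags lists models
// with an explicit tag (e.g. "llama3:latest"), so a configured name without
// a tag must be normalized (Ollama implies ":latest") before comparison.
fn model_matches(configured: &str, listed: &str) -> bool {
    let normalized = if configured.contains(':') {
        configured.to_string()
    } else {
        format!("{configured}:latest")
    };
    normalized == listed
}

fn main() {
    // "llama3" should match the "llama3:latest" entry from /api/tags.
    assert!(model_matches("llama3", "llama3:latest"));
    // Explicitly tagged models compare directly.
    assert!(model_matches("llama3:8b-instruct-q5_K_M", "llama3:8b-instruct-q5_K_M"));
    // Different models still fail the check.
    assert!(!model_matches("llama3", "codellama:latest"));
    println!("ok");
}
```

Querying /api/show with the untagged name, as this PR does, sidesteps the normalization entirely, since Ollama resolves the implied :latest tag server-side.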

My mistake: when implementing this I only tested with explicit tags, since I was using q5_K_M models. I am sorry I did not find this before the 0.12 release :cry: