valentinfrlch / ha-llmvision

Let Home Assistant see!
Apache License 2.0

Add Ollama vision support #4

Closed · jmadden91 closed this issue 6 months ago

jmadden91 commented 6 months ago

Thanks for your work on this integration. It works perfectly with OpenAI.

Would it be possible to add Ollama as a supported provider? I tried adding it using the "LocalAI" provider with the port changed to 11434; it adds fine, but I get a 400 error when trying to run the service.

Here is the Ollama blog post about vision support: https://ollama.com/blog/vision-models
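
For reference, a request to Ollama's native vision API (as described in that post) looks roughly like this; just a sketch, with the host, model name, and image path as placeholders:

```python
import base64
import requests

# Base64-encode the image, as Ollama's native API expects.
with open("snapshot.jpg", "rb") as f:  # placeholder image path
    image_b64 = base64.b64encode(f.read()).decode("utf-8")

# Ollama's native generate endpoint on its default port 11434.
resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llava",  # any vision-capable model pulled into Ollama
        "prompt": "What is in this picture?",
        "images": [image_b64],
        "stream": False,  # return a single JSON object instead of a stream
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["response"])
```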


Cheers!

valentinfrlch commented 6 months ago

Thank you for the suggestion, I think Ollama would be a great addition. The reason it doesn't work with the LocalAI provider is that they use different endpoints in their APIs. However, this should be entirely doable. I think I can ship this in the next release.
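
To sketch the difference (example host, port, and model name only): the OpenAI and LocalAI providers send an OpenAI-style chat completions request like the one below, while Ollama's native endpoint (sketched above) uses a different path and payload shape, so the same request body can't simply be reused against port 11434.

```python
import base64
import requests

with open("snapshot.jpg", "rb") as f:  # placeholder image path
    image_b64 = base64.b64encode(f.read()).decode("utf-8")

# OpenAI-style chat completions request, the format LocalAI also speaks
# (example LocalAI address and model name):
resp = requests.post(
    "http://localhost:8080/v1/chat/completions",
    json={
        "model": "llava",
        "messages": [{
            "role": "user",
            "content": [
                {"type": "text", "text": "What is in this picture?"},
                {"type": "image_url",
                 "image_url": {"url": f"data:image/jpeg;base64,{image_b64}"}},
            ],
        }],
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```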

valentinfrlch commented 6 months ago

Ollama is now supported as of v0.3.5

mudler commented 6 months ago

> Thank you for the suggestion, I think Ollama would be a great addition. The reason it doesn't work with the LocalAI provider is that they use different endpoints in their APIs. However, this should be entirely doable. I think I can ship this in the next release.

Is there something I can help with? LocalAI should be entirely compatible with the vision API, and we test that against our CI.

@jmadden91 care to share the full error?

jmadden91 commented 6 months ago

> Ollama is now supported as of v0.3.5

Amazing, thank you so much. It works perfectly.

jmadden91 commented 6 months ago

> Thank you for the suggestion, I think Ollama would be a great addition. The reason it doesn't work with the LocalAI provider is that they use different endpoints in their APIs. However, this should be entirely doable. I think I can ship this in the next release.
>
> Is there something I can help with? LocalAI should be entirely compatible with the vision API, and we test that against our CI.
>
> @jmadden91 care to share the full error?

Sorry, I don't have LocalAI running, but as far as I'm aware it works with this integration. The error I had above came from trying to use this integration's LocalAI provider against my Ollama installation.