Closed · jmadden91 closed this issue 6 months ago
Thank you for the suggestion, I think Ollama would be a great addition. The reason it doesn't work with the LocalAI provider is that Ollama and LocalAI use different endpoints in their APIs. However, this should be entirely doable. I think I can ship this in the next release.
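For context on the endpoint difference mentioned above: LocalAI exposes the OpenAI-style /v1/chat/completions route, whereas Ollama's native API lives under /api (for example /api/chat) and takes images as plain base64 strings rather than data URLs. Below is a minimal sketch of an Ollama vision request, assuming the default port 11434, a locally pulled vision model such as llava, and an example image file name.

```python
# Sketch of a vision request against Ollama's native /api/chat endpoint.
# Assumptions: Ollama on its default port 11434, a pulled "llava" model,
# and "doorbell.jpg" as a stand-in image file.
import base64
import requests

with open("doorbell.jpg", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode("utf-8")

resp = requests.post(
    "http://localhost:11434/api/chat",
    json={
        "model": "llava",
        "stream": False,
        "messages": [
            {
                "role": "user",
                "content": "Describe this image.",
                # Ollama's native API takes raw base64 strings, not data URLs
                "images": [image_b64],
            }
        ],
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["message"]["content"])
```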
Is there something I can help with? LocalAI should be entirely compatible with the vision API, and we test that in our CI.
@jmadden91 care to share the full error?
Ollama is now supported as of v0.3.5
Amazing, thank you so much. Works perfectly
I don't have LocalAI running, sorry, but as far as I'm aware it works with this integration. The 400 error from my original post was due to me trying to use the LocalAI functionality of this integration with my Ollama installation.
Thanks for your work on this integration. It works perfectly with OpenAI.
Would it be possible to add Ollama as a supported provider? I tried adding it using the "localAI" provider with the port changed to 11434, and while it adds fine, I get a 400 error when trying to run the service.
Here is the Ollama blog post about vision support: https://ollama.com/blog/vision-models
Cheers!
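For reference, the 400 error described here is consistent with the "localAI" provider posting an OpenAI-style chat-completions body, which LocalAI and OpenAI accept but Ollama's native /api endpoints do not. Below is a rough sketch of that request shape; the base URL, API key, and model name are placeholder assumptions (LocalAI typically listens on port 8080, and against OpenAI the base URL would be https://api.openai.com/v1).

```python
# Sketch of the OpenAI-style vision payload that OpenAI-compatible servers
# (OpenAI, LocalAI) accept. Base URL, API key, and model name are placeholders.
import base64
import requests

with open("doorbell.jpg", "rb") as f:
    data_url = "data:image/jpeg;base64," + base64.b64encode(f.read()).decode("utf-8")

resp = requests.post(
    "http://localhost:8080/v1/chat/completions",  # placeholder LocalAI base URL
    headers={"Authorization": "Bearer sk-..."},   # placeholder API key
    json={
        "model": "gpt-4o",  # or a vision-capable model name on the local server
        "messages": [
            {
                "role": "user",
                "content": [
                    {"type": "text", "text": "Describe this image."},
                    # OpenAI-style requests embed the image as a data URL
                    {"type": "image_url", "image_url": {"url": data_url}},
                ],
            }
        ],
        "max_tokens": 200,
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```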