taketwo / llm-ollama

LLM plugin providing access to local Ollama models via the HTTP API
Apache License 2.0

Support for embedding models installed through Ollama #14

Closed javierhb23 closed 1 month ago

javierhb23 commented 3 months ago

The llm library offers support for working with embedding models through plugins such as llm-sentence-transformers. I'm curious whether there are plans to add support for embedding models installed through Ollama (e.g. `ollama pull mxbai-embed-large`) to this project, or whether this would be provided by a separate plugin altogether?

Edit: Sorry, I just noticed the embed branch; it looks like it's a WIP. Feel free to close this issue.

taketwo commented 3 months ago

Yes, I did an experiment, and I think it worked, but I never finished and merged it. Currently, I don't have a personal use case for this, so it's not on my list. But contributions are always welcome!

sukhbinder commented 1 month ago

Saw this today. I have added support for Ollama embedding models as a separate plugin: llm-embed-ollama

javierhb23 commented 1 month ago

Cool! Thanks @sukhbinder, I'll check it out. Closing 👍

taketwo commented 3 weeks ago

Thanks to @zivoy's contribution, we now have embedding support in this plugin: https://github.com/taketwo/llm-ollama/releases/tag/0.7.0
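For anyone landing here later, a minimal sketch of what embedding through Ollama looks like at the HTTP level, which is roughly what such a plugin wraps. This is not the plugin's actual code; it assumes a locally running Ollama server on the default port (11434) and the `/api/embeddings` endpoint, and reuses the `mxbai-embed-large` model name from the original question.

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # Ollama's default listen address


def build_embedding_request(model: str, prompt: str) -> dict:
    """Build the JSON payload expected by Ollama's /api/embeddings endpoint."""
    return {"model": model, "prompt": prompt}


def embed(model: str, prompt: str, base_url: str = OLLAMA_URL) -> list[float]:
    """POST the prompt to a running Ollama server and return the embedding vector.

    Requires the model to be pulled first, e.g.:  ollama pull mxbai-embed-large
    """
    payload = json.dumps(build_embedding_request(model, prompt)).encode()
    req = urllib.request.Request(
        f"{base_url}/api/embeddings",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["embedding"]


# Example (needs a running Ollama server):
#   vector = embed("mxbai-embed-large", "hello world")
#   print(len(vector))
```

With llm-ollama 0.7.0 itself you would instead use the `llm` CLI or Python API directly rather than raw HTTP; the sketch above just shows what happens under the hood.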