taketwo / llm-ollama

LLM plugin providing access to local Ollama models using HTTP API
Apache License 2.0

Support for embedding models installed through Ollama #14

Open javierhb23 opened 1 month ago

javierhb23 commented 1 month ago

The llm library offers support for working with embedding models through plugins such as llm-sentence-transformers. I'm curious whether there are plans to add support for embedding models installed through Ollama (e.g. ollama pull mxbai-embed-large) to this project, or whether this would be provided by a separate plugin altogether?

Edit: Sorry, I just noticed the embed branch; it looks like it's a WIP. Feel free to close this issue.
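
For reference, a minimal sketch of what such support could look like, assuming llm's register_embedding_models plugin hook and Ollama's /api/embeddings HTTP endpoint; the class, model name, and ID prefix below are illustrative, not the plugin's actual implementation:

```python
import httpx
import llm


@llm.hookimpl
def register_embedding_models(register):
    # Hypothetical registration of a single model; a real plugin would likely
    # enumerate the embedding models installed in the local Ollama instance.
    register(OllamaEmbedding("mxbai-embed-large"))


class OllamaEmbedding(llm.EmbeddingModel):
    def __init__(self, model_name):
        self.model_id = f"ollama-{model_name}"
        self.ollama_model = model_name

    def embed_batch(self, items):
        # Call Ollama's embeddings endpoint once per item; no batching assumed here.
        for item in items:
            response = httpx.post(
                "http://localhost:11434/api/embeddings",
                json={"model": self.ollama_model, "prompt": item},
            )
            response.raise_for_status()
            yield response.json()["embedding"]
```

With something along these lines installed, embeddings could presumably be generated via llm's existing CLI, e.g. llm embed -m ollama-mxbai-embed-large -c "some text".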

taketwo commented 1 month ago

Yes, I did an experiment and I think it worked, but I never finished or merged it. Currently, I don't have a personal use case for this, so it's not on my list. But contributions are always welcome!