Closed Mayorc1978 closed 6 months ago
Tested with nomic-embed-text and Ollama and works as intended.
Hey @Mayorc1978 where can I find the file or configuration to use it with Ollama? Hope you have a fantastic day!
@Mayorc1978 can you please provide the changes you made to make it work with Ollama...thanks!
import os

from langchain_community.embeddings import OllamaEmbeddings

# Get Ollama host and embedding model from environment variables
# Set default value for OLLAMA_HOST
ollama_host = os.getenv("OLLAMA_HOST", "http://host.docker.internal:11434")
# Set default value for OLLAMA_EMBEDDING_MODEL
ollama_embedding_model = os.getenv("OLLAMA_EMBEDDING_MODEL", "nomic-embed-text")
_embeddings = OllamaEmbeddings(model=ollama_embedding_model, base_url=ollama_host)
Changing embeddings.py like this works for me.
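For what it's worth, the defaulting behavior in that patch can be sketched as a small helper (the function name here is illustrative, not part of the project):

```python
import os

def resolve_ollama_config() -> tuple[str, str]:
    """Read the Ollama host and embedding model from the environment,
    falling back to the same defaults used in the patched embeddings.py."""
    host = os.getenv("OLLAMA_HOST", "http://host.docker.internal:11434")
    model = os.getenv("OLLAMA_EMBEDDING_MODEL", "nomic-embed-text")
    return host, model
```

Set the two variables in your docker-compose or shell environment to override the defaults.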
@zx9597446 that will only change the embedding model, if at all.
Using Docker and setting the BASE_URL to the LLM running on my host inside LM Studio, I was able to make it work, at least partially.
But it stops working the moment it searches for embeddings:
[2024-03-15 22:08:09.682] [ERROR] Unexpected endpoint or method. (POST /v1/embeddings). Returning 200 anyway
So, does it look for a specific embedding model, or for any available model? In the latter case it could work with Ollama, since Ollama serves embedding models; Nomic Embed, for instance, is fast and outperforms text-embedding-ada-002 and text-embedding-3-small on both short- and long-context tasks.
Usually accessed with this:
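(The snippet that followed here isn't shown in the thread; as a rough sketch, Ollama's embeddings endpoint is usually called like this — the host, port, and model name below are assumptions, and note the path is /api/embeddings, not the OpenAI-style /v1/embeddings that LM Studio logged:)

```python
import json
import urllib.request

OLLAMA_HOST = "http://localhost:11434"  # Ollama's default port

def build_embed_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a POST request for Ollama's native embeddings endpoint."""
    payload = json.dumps({"model": model, "prompt": prompt}).encode("utf-8")
    return urllib.request.Request(
        f"{OLLAMA_HOST}/api/embeddings",
        data=payload,
        headers={"Content-Type": "application/json"},
    )

def embed(model: str, prompt: str) -> list[float]:
    """Send the request and return the embedding vector (needs a running server)."""
    with urllib.request.urlopen(build_embed_request(model, prompt)) as resp:
        return json.loads(resp.read())["embedding"]
```

e.g. `embed("nomic-embed-text", "hello world")` returns a list of floats when an Ollama server is running locally.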