Closed — dmoore44 closed this issue 2 months ago
I'm having the same problem. Waiting for a solution.
@mingtaoloreal Got it — take a look in chains.py (lines 29-56 are the relevant ones). If you're using Ollama to serve your model, you'll need to change your EMBEDDING_MODEL value to ollama in your .env file.
The default EMBEDDING_MODEL value conflicts with the LLM's embedding model — the vectors it produces have a different dimension than the model being served expects.
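As a sketch of the fix described above — only the EMBEDDING_MODEL variable is confirmed by this thread; the other lines are illustrative assumptions about a typical genai-stack .env layout:

```shell
# .env (sketch — adjust to your actual file)
EMBEDDING_MODEL=ollama        # was a different default; must match the backend serving your model
# OLLAMA_BASE_URL=http://host.docker.internal:11434   # illustrative: where Ollama is reachable from the containers
```

After editing .env, re-run `docker compose up --build` so the containers pick up the new value.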
Standing up the genai-stack fails when using
docker compose up
or docker compose up --build
- specifically, it fails when standing up the genai-stack-api-1 container. Here's the truncated output when trying to build...