Canner / WrenAI

🚀 Open-source SQL AI Agent for Text-to-SQL. Make Text2SQL Easy! 🙌
https://getwren.ai/oss

Wren-ai-service keeps restarting #531

Closed: Namec999 closed this issue 2 months ago

Namec999 commented 2 months ago

Using the latest 7.3 version, installing everything on Windows, I got this error in the wren-ai-service logs:

```
INFO:     Started server process [7]
INFO:     Waiting for application startup.
2024-07-17 09:56:47,796 - wren-ai-service - INFO - Initializing providers... (utils.py:64)
2024-07-17 09:56:53,253 - wren-ai-service - INFO - Registering provider: qdrant (loader.py:66)
2024-07-17 09:56:55,009 - wren-ai-service - INFO - Registering provider: azure_openai_embedder (loader.py:66)
2024-07-17 09:56:55,035 - wren-ai-service - INFO - Registering provider: ollama_embedder (loader.py:66)
2024-07-17 09:56:55,045 - wren-ai-service - INFO - Registering provider: openai_embedder (loader.py:66)
2024-07-17 09:56:55,055 - wren-ai-service - INFO - Registering provider: wren_ui (loader.py:66)
2024-07-17 09:56:55,056 - wren-ai-service - INFO - Registering provider: wren_ibis (loader.py:66)
2024-07-17 09:56:55,115 - wren-ai-service - INFO - Registering provider: azure_openai_llm (loader.py:66)
2024-07-17 09:56:55,147 - wren-ai-service - INFO - Registering provider: ollama_llm (loader.py:66)
2024-07-17 09:56:55,154 - wren-ai-service - INFO - Registering provider: openai_llm (loader.py:66)
2024-07-17 09:56:55,367 - wren-ai-service - INFO - Pulling Ollama model phi3:mini (loader.py:109)
2024-07-17 09:56:56,163 - wren-ai-service - INFO - Pulling Ollama model phi3:mini: 100% (loader.py:116)
2024-07-17 09:56:56,165 - wren-ai-service - INFO - Using Ollama LLM: phi3:mini (ollama.py:135)
2024-07-17 09:56:56,166 - wren-ai-service - INFO - Using Ollama URL: http://host.docker.internal:11434 (ollama.py:136)
2024-07-17 09:56:56,279 - wren-ai-service - INFO - Pulling Ollama model nomic-emded-text:latest (loader.py:109)
ERROR:    Traceback (most recent call last):
  File "/app/.venv/lib/python3.12/site-packages/starlette/routing.py", line 734, in lifespan
    async with self.lifespan_context(app) as maybe_state:
  File "/usr/local/lib/python3.12/contextlib.py", line 204, in __aenter__
    return await anext(self.gen)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/src/main.py", line 28, in lifespan
    container.init_globals()
  File "/src/globals.py", line 36, in init_globals
    llm_provider, embedder_provider, document_store_provider, engine = init_providers()
                                                                       ^^^^^^^^^^^^^^^^
  File "/src/utils.py", line 68, in init_providers
    embedder_provider = loader.get_provider(
                        ^^^^^^^^^^^^^^^^^^^^
  File "/src/providers/embedder/ollama.py", line 170, in __init__
    pull_ollama_model(self._url, self._embedding_model)
  File "/src/providers/loader.py", line 111, in pull_ollama_model
    for progress in client.pull(model_name, stream=True):
  File "/app/.venv/lib/python3.12/site-packages/ollama/_client.py", line 89, in _stream
    raise ResponseError(e)
ollama._types.ResponseError: pull model manifest: file does not exist
ERROR:    Application startup failed. Exiting.
Waiting for wren-ai-service to start...
```
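For reference, the failing call can be reproduced outside the service. Below is a minimal sketch, assuming the `ollama` Python package is installed and the Ollama server is reachable at the same URL the service uses; it issues the same `client.pull(..., stream=True)` call that `providers/loader.py` makes:

```python
# Minimal repro sketch (assumptions: `pip install ollama`, Ollama server at
# the URL configured as EMBEDDER_OLLAMA_URL).
from ollama import Client, ResponseError

client = Client(host="http://host.docker.internal:11434")

try:
    # Drain the streamed progress updates; a misspelled model name makes the
    # registry lookup fail and the client raises ResponseError.
    for progress in client.pull("nomic-emded-text:latest", stream=True):
        pass
    print("pull succeeded")
except ResponseError as err:
    # With the name above, this reproduces: pull model manifest: file does not exist
    print(f"pull failed: {err}")
```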

Namec999 commented 2 months ago

And here is my config file:

```
# LLM
LLM_PROVIDER=ollama_llm # openai_llm, azure_openai_llm, ollama_llm
GENERATION_MODEL=phi3:mini
GENERATION_MODEL_KWARGS={"temperature": 0, "n": 1, "max_tokens": 4096, "response_format": {"type": "json_object"}}

# openai or openai-api-compatible
LLM_OPENAI_API_KEY=sk-xxxx
LLM_OPENAI_API_BASE=https://api.openai.com/v1

# azure_openai
LLM_AZURE_OPENAI_API_KEY=
LLM_AZURE_OPENAI_API_BASE=
LLM_AZURE_OPENAI_VERSION=

# ollama
LLM_OLLAMA_URL=http://host.docker.internal:11434

# EMBEDDER
EMBEDDER_PROVIDER=ollama_embedder # openai_embedder, azure_openai_embedder, ollama_embedder

# supported embedding model providers by qdrant: https://qdrant.tech/documentation/embeddings/
EMBEDDING_MODEL=nomic-emded-text:latest
EMBEDDING_MODEL_DIMENSION=768

# openai or openai-api-compatible
EMBEDDER_OPENAI_API_KEY=sk-xxxx
EMBEDDER_OPENAI_API_BASE=https://api.openai.com/v1

# azure_openai
EMBEDDER_AZURE_OPENAI_API_KEY=
EMBEDDER_AZURE_OPENAI_API_BASE=
EMBEDDER_AZURE_OPENAI_VERSION=

# ollama
EMBEDDER_OLLAMA_URL=http://host.docker.internal:11434

# DOCUMENT_STORE
DOCUMENT_STORE_PROVIDER=qdrant
QDRANT_HOST=qdrant
```

cyyeh commented 2 months ago

@Namec999 I found the cause: there is a typo in your embedding model name. It should be `nomic-embed-text:latest`.
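In .env terms, that means changing `EMBEDDING_MODEL=nomic-emded-text:latest` to `EMBEDDING_MODEL=nomic-embed-text:latest`. A small follow-up sketch (same assumptions as the repro above: the `ollama` Python package plus the configured Ollama URL) to confirm the corrected model name pulls cleanly before restarting the service:

```python
# Check sketch (assumptions: `pip install ollama`, Ollama reachable at the
# EMBEDDER_OLLAMA_URL value from the .env).
from ollama import Client

client = Client(host="http://host.docker.internal:11434")

# Print the raw model list so the exact names/tags can be compared against
# EMBEDDING_MODEL in the .env.
print(client.list())

# Pull the corrected name; with the typo fixed this should stream to 100%.
for progress in client.pull("nomic-embed-text:latest", stream=True):
    print(progress)
```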

Namec999 commented 2 months ago

My bad, lol.

That was the problem, and it works now.

Thank you!