When attempting to start a pipelines container with the llamaindex_ollama_pipeline pre-included via the PIPELINES_URLS env var, the container fails to start, and the logs show that it errors out with the following error:
open-webui-pipelines | [notice] A new release of pip is available: 24.0 -> 24.1.1
open-webui-pipelines | [notice] To update, run: pip install --upgrade pip
open-webui-pipelines | No requirements found in frontmatter of ./pipelines/openai_dalle_manifold_pipeline.py.
open-webui-pipelines | No requirements found in frontmatter of ./pipelines/wikipedia_pipeline.py.
open-webui-pipelines | INFO: Started server process [110]
open-webui-pipelines | INFO: Waiting for application startup.
Downloading readme: 100%|██████████| 46.0/46.0 [00:00<00:00, 645kB/s]
open-webui-pipelines | Loaded module: haystack_pipeline
open-webui-pipelines | Loaded module: wikipedia_pipeline
open-webui-pipelines | Loaded module: anthropic_manifold_pipeline
open-webui-pipelines | Loaded module: openai_dalle_manifold_pipeline
open-webui-pipelines | Loaded module: llamaindex_ollama_pipeline
Downloading data: 100%|██████████| 119k/119k [00:00<00:00, 785kB/s]
Generating train split: 100%|██████████| 151/151 [00:00<00:00, 61441.59 examples/s]
open-webui-pipelines | /usr/local/lib/python3.11/site-packages/huggingface_hub/file_download.py:1132: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`.
open-webui-pipelines | warnings.warn(
Batches: 100%|██████████| 5/5 [00:02<00:00, 2.28it/s]
open-webui-pipelines | ERROR: Traceback (most recent call last):
open-webui-pipelines | File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 732, in lifespan
open-webui-pipelines | async with self.lifespan_context(app) as maybe_state:
open-webui-pipelines | File "/usr/local/lib/python3.11/contextlib.py", line 210, in __aenter__
open-webui-pipelines | return await anext(self.gen)
open-webui-pipelines | ^^^^^^^^^^^^^^^^^^^^^
open-webui-pipelines | File "/app/main.py", line 211, in lifespan
open-webui-pipelines | await on_startup()
open-webui-pipelines | File "/app/main.py", line 190, in on_startup
open-webui-pipelines | await module.on_startup()
open-webui-pipelines | File "/app/./pipelines/llamaindex_ollama_pipeline.py", line 34, in on_startup
open-webui-pipelines | self.documents = SimpleDirectoryReader("./data").load_data()
open-webui-pipelines | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
open-webui-pipelines | File "/usr/local/lib/python3.11/site-packages/llama_index/core/readers/file/base.py", line 260, in __init__
open-webui-pipelines | raise ValueError(f"Directory {input_dir} does not exist.")
open-webui-pipelines | ValueError: Directory ./data does not exist.
open-webui-pipelines |
open-webui-pipelines | ERROR: Application startup failed. Exiting.
open-webui-pipelines | on_startup:wikipedia_pipeline
open-webui-pipelines | on_startup:anthropic_manifold_pipeline
open-webui-pipelines | on_startup:openai_dalle_manifold_pipeline
Any ideas?
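Based on the traceback, `SimpleDirectoryReader("./data")` resolves relative to `/app` inside the container, so it seems to expect a `/app/data` directory that the image doesn't ship with. Is the expected fix simply to pre-create or mount that directory? A rough sketch of what I mean (the `/app/data` container path is my inference from the traceback, not something I've confirmed):

```shell
# Hypothetical workaround: create the directory that
# SimpleDirectoryReader("./data") expects on the host first.
mkdir -p ./data

# Then mount it into the container's working directory
# (image name and mount path are assumptions on my part):
# docker run -v "$(pwd)/data:/app/data" ghcr.io/open-webui/pipelines:main
```

If that's the intended setup, it would be good to know whether the pipeline should instead skip loading when the directory is empty or absent, rather than failing the whole application startup.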