open-webui / pipelines

Pipelines: Versatile, UI-Agnostic OpenAI-Compatible Plugin Framework

Unable to pip install llama-index-embeddings-ollama leading to pipeline problems #237

UmiVilbig opened this issue 2 months ago (status: Open)

UmiVilbig commented 2 months ago

I used /examples/pipelines/rag/llamaindex_ollama_github_pipeline.py and uploaded the file through the main UI. The upload returned a status 500, but after refreshing the page the pipeline does appear in the list.

Once the pipeline has been uploaded, restarting the container puts it into a perpetual restart loop: the package installation error crashes the container, the container restarts, and it hits the same blocker again.

Looking through the Docker logs, the problem appears to stem from the package llama-index-embeddings-ollama. When I try to pip install it inside the container I get: could not find a version that satisfies the requirement llama-index-embeddings-ollama (from versions: none)

This could just be a Python version problem.
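(Editor's note: a minimal sketch for checking that hypothesis, assuming you can exec into the container and run Python there. The Python floor of >=3.8.1 is taken from the llama-index packaging metadata at the time of writing; verify the exact bound on PyPI for your version.)

```python
# Check whether the container's interpreter satisfies the package's Python requirement.
# If it does not, pip reports "(from versions: none)" exactly as described above.
import sys

print("Running Python", sys.version.split()[0])
if sys.version_info < (3, 8, 1):
    print("Too old for llama-index-embeddings-ollama; pip will find no installable version.")
else:
    print("Version looks OK; the failure is more likely an index or dependency-resolution issue.")
```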

nickibarlowSDI commented 2 months ago

An infinite "updating but incompatible" loop happens to me when I run:

docker run -d -p 9099:9099 --add-host=host.docker.internal:host-gateway -v pipelines:/app/pipelines --name pipelines-official --restart always ghcr.io/open-webui/pipelines:main

For example: llama-index-cli 0.1.13 requires llama-index-core<0.11.0,>=0.10.11.post1, but you have llama-index-core 0.11.6 which is incompatible.
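(Editor's note: a small sketch for confirming which llama-index-core actually ends up installed inside the container; the package names and the <0.11.0 bound are taken from the pip error quoted above, not from the pipelines repo itself.)

```python
# Reproduce the incompatibility check pip reports, from inside the running container.
from importlib.metadata import version, PackageNotFoundError

try:
    core = version("llama-index-core")
    print("llama-index-core", core)
    if not core.startswith("0.10."):
        print("Likely incompatible with llama-index-cli 0.1.13 (needs >=0.10.11.post1,<0.11.0)")
except PackageNotFoundError:
    print("llama-index-core is not installed")
```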

I believe this is caused by the line

branch_data: GitBranchResponseModel = self._loop.run_until_complete(

and/or by the lack of a GITHUB_TOKEN for one of the default pipelines.
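(Editor's note: a minimal sketch of the kind of guard being suggested, assuming the example pipeline reads the token from a GITHUB_TOKEN environment variable as the comment implies; the actual pipeline may source it differently, and the reader usage shown is hypothetical.)

```python
# Skip the GitHub ingestion step when no token is available, instead of letting
# the startup crash and trigger the container restart loop described above.
import os

github_token = os.environ.get("GITHUB_TOKEN")
if not github_token:
    print("GITHUB_TOKEN not set; skipping GitHub ingestion for this pipeline.")
else:
    # reader = GithubRepositoryReader(...)           # hypothetical usage
    # branch_data = loop.run_until_complete(...)     # as in the line quoted above
    pass
```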