opea-project / GenAIExamples

Generative AI Examples is a collection of GenAI examples, such as ChatQnA and Copilot, that illustrate the pipeline capabilities of the Open Platform for Enterprise AI (OPEA) project.
https://opea.dev
Apache License 2.0

[Bug] langchain not installed successfully in chatqna-llm-uservice for dependency conflicts #915

Open shaohef opened 1 month ago

shaohef commented 1 month ago

Priority

P1-Stopper

OS type

Ubuntu

Hardware type

Xeon-GNR

Installation method

Deploy method

Running nodes

Single Node

What's the version?

Containers:
  chatqna:
    Container ID: containerd://0167fdbc7922532c6a4c2f9740432b59734973ee5128c6abc9e1f9fc698610a3
    Image: opea/llm-tgi:latest
    Image ID: sha256:b9587f11aa60f0265bc79d93bc8735e2dbf2b99fa021c40d3133f65e10faaa96

Description

kubectl logs chatqna-llm-uservice-dbf756484-pw6lj

Downloading requests_toolbelt-1.0.0-py2.py3-none-any.whl.metadata (14 kB)
Requirement already satisfied: charset-normalizer<4,>=2 in /usr/local/lib/python3.11/site-packages (from requests<3,>=2->langsmith<0.2.0,>=0.1.125->langchain-core<0.4,>=0.3->langserve->-r requirements-runtime.txt (line 1)) (3.3.2)
Requirement already satisfied: urllib3<3,>=1.21.1 in /usr/local/lib/python3.11/site-packages (from requests<3,>=2->langsmith<0.2.0,>=0.1.125->langchain-core<0.4,>=0.3->langserve->-r requirements-runtime.txt (line 1)) (2.1.0)
Downloading langserve-0.3.0-py3-none-any.whl (1.2 MB)
   ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.2/1.2 MB 2.8 MB/s eta 0:00:00
Downloading langchain_core-0.3.9-py3-none-any.whl (401 kB)
   ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 401.8/401.8 kB 1.9 MB/s eta 0:00:00
Downloading pydantic-2.9.2-py3-none-any.whl (434 kB)
   ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 434.9/434.9 kB 2.1 MB/s eta 0:00:00
Downloading pydantic_core-2.23.4-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (2.1 MB)
   ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 2.1/2.1 MB 9.4 MB/s eta 0:00:00
Downloading langsmith-0.1.132-py3-none-any.whl (294 kB)
   ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 294.6/294.6 kB 1.4 MB/s eta 0:00:00
Downloading requests_toolbelt-1.0.0-py2.py3-none-any.whl (54 kB)
   ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 54.5/54.5 kB 220.1 kB/s eta 0:00:00
Installing collected packages: pydantic-core, requests-toolbelt, pydantic, langsmith, langchain-core, langserve
WARNING: The script langsmith is installed in '/home/user/.local/bin' which is not on PATH.
Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
langchain 0.1.0 requires langchain-core<0.2,>=0.1.7, but you have langchain-core 0.3.9 which is incompatible.
langchain 0.1.0 requires langsmith<0.1.0,>=0.0.77, but you have langsmith 0.1.132 which is incompatible.
langchain-community 0.0.9 requires langchain-core<0.2,>=0.1.7, but you have langchain-core 0.3.9 which is incompatible.
langchain-community 0.0.9 requires langsmith<0.1.0,>=0.0.63, but you have langsmith 0.1.132 which is incompatible.
Successfully installed langchain-core-0.3.9 langserve-0.3.0 langsmith-0.1.132 pydantic-2.9.2 pydantic-core-2.23.4 requests-toolbelt-1.0.0
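Per the log above, the image already contains langchain 0.1.0 and langchain-community 0.0.9 (which pin langchain-core<0.2 and langsmith<0.1.0), while langserve from requirements-runtime.txt pulls in langchain-core 0.3.9 and langsmith 0.1.132, hence the conflict. A minimal sketch for confirming which versions actually ended up in the container (assuming python3 is available inside it; the package list simply mirrors what pip flagged above):

```python
# Hedged sketch: print the installed versions of the packages pip flagged,
# to confirm whether the conflict is still present after updating the image.
from importlib.metadata import PackageNotFoundError, version

for pkg in ("langchain", "langchain-community", "langchain-core", "langsmith", "langserve"):
    try:
        print(f"{pkg}=={version(pkg)}")
    except PackageNotFoundError:
        print(f"{pkg}: not installed")
```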

[notice] A new release of pip is available: 24.1.2 -> 24.2
[notice] To update, run: pip install --upgrade pip
/home/user/.local/lib/python3.11/site-packages/pydantic/_internal/_fields.py:132: UserWarning: Field "model_name_orpath" in Audio2TextDoc has conflict with protected namespace "model".

You may be able to resolve this warning by setting model_config['protected_namespaces'] = ().
  warnings.warn(
[2024-10-08 08:48:50,757] [ INFO] - CORS is enabled.
[2024-10-08 08:48:50,758] [ INFO] - Setting up HTTP server
[2024-10-08 08:48:50,759] [ INFO] - Uvicorn server setup on port 9000
INFO:     Waiting for application startup.
INFO:     Application startup complete.
INFO:     Uvicorn running on http://0.0.0.0:9000 (Press CTRL+C to quit)
[2024-10-08 08:48:50,773] [ INFO] - HTTP server setup successful
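The UserWarning above comes from pydantic v2's protected "model_" namespace and is separate from the langchain conflict; the Uvicorn log shows the service still starts. The log itself suggests the fix; a minimal sketch of what that looks like on a pydantic model (the class body, field name, and default here are illustrative stand-ins, not the actual GenAIComps definition of Audio2TextDoc):

```python
# Hedged sketch: clear pydantic's protected namespaces so field names
# beginning with "model_" no longer trigger the UserWarning in the log.
from pydantic import BaseModel, ConfigDict


class Audio2TextDoc(BaseModel):
    # Empty tuple disables the default ("model_",) protected-namespace check.
    model_config = ConfigDict(protected_namespaces=())

    model_name_or_path: str = "openai/whisper-small"  # illustrative field/default
```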

Reproduce steps

kubectl apply -f ./

Raw log

No response

lianhao commented 1 month ago

This is not a bug related to K8S. It should be related to the GenAIComps opea/llm-tgi container image itself. Please remove me and assign the appropriate code owner.

lianhao commented 1 month ago

@shaohef please update your local container image opea/llm-tgi to the latest, and try it again. I believe it should be resolved by PR opea-project/GenAIComps#704

xiguiw commented 1 month ago

> @shaohef please update your local container image opea/llm-tgi to the latest, and try it again. I believe it should be resolved by PR opea-project/GenAIComps#704

@shaohef Any progress? Did you try the above suggestion?