Priority
P1-Stopper
OS type
Ubuntu
Hardware type
Xeon-SPR
Installation method
Deploy method
Running nodes
Single Node
What's the version?
https://github.com/opea-project/GenAIExamples/commit/3c164f3aa25bae957e436a09628235e1a11d6e8d
Description
When I run the command to launch the E2E service with the TGI backend and re-ranking:
docker compose -f compose.yaml up -d
after setting my environment variables (I am not behind a proxy) and building each Docker image, I get a port error for the tgi-service, even though no process is using that port. The error can be seen in the attached image. This blocks me from launching the system entirely.
Reproduce steps
Followed these instructions exactly to run the megaservice with the re-ranker using the TGI backend, starting from here: https://github.com/opea-project/GenAIExamples/tree/main/ChatQnA/docker_compose/intel/cpu/xeon#-build-docker-images
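As a diagnostic sketch (the port number 9009 below is an assumption; substitute whatever host port appears in the tgi-service error message), one way to double-check that nothing is actually bound to the port before running docker compose is to try binding it directly:

```python
import socket

def port_is_free(port, host="0.0.0.0"):
    """Return True if nothing is currently bound to host:port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        try:
            s.bind((host, port))
            return True
        except OSError:
            # bind() fails with EADDRINUSE when another process holds the port
            return False

# Hypothetical example: check the host port from the tgi-service error
print(port_is_free(9009))
```

If this reports the port as free but docker compose still fails, the conflict may come from a leftover container from a previous run (`docker ps -a`) rather than a host process.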
Raw log
No response