mkbhanda closed this issue 4 weeks ago
We could indeed use the container_name defined in compose.yaml, and it works.
The only potential issue is two Docker containers having the same name, but that should not happen if the user deploys via docker compose. If there is no concern about using the container_name defined in compose.yaml, we can update the README accordingly. What is your take?
We also agree on the first request and will work on that too.
ChatQnA is updated accordingly in PR https://github.com/opea-project/GenAIExamples/pull/1018.
Please let us know of any remaining issues in the latest main branch; if there are none, we will close this for now.
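A minimal sketch of the container_name approach discussed above, assuming an illustrative service called llm_service (the real ChatQnA compose.yaml uses different service names and images):

```yaml
services:
  llm_service:                          # illustrative service name, not the actual one
    image: example/llm-serving:latest   # placeholder image
    container_name: llm_service         # fixed name, so docs can reference it directly
```

With container_name set, the README can reference the container by its fixed name instead of asking users to look up an auto-generated one. Compose refuses to start a second container with the same name, which is why clashes should not occur under docker compose.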
Priority
P4-Low
OS type
Ubuntu
Hardware type
CPU-other (Please let us know in description)
Installation method
Deploy method
Running nodes
Single Node
What's the version?
https://github.com/opea-project/GenAIExamples/commit/c930bea172d535ca24c51a4419f08547190747e7
Description
1) For completeness of the instructions, particularly when the Docker images specified in GenAIComps need to be built, add an instruction to git clone the GenAIComps repo. The Dockerfile also refers to $PYTHON_PATH, which users need to set.
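The missing setup step could be sketched as follows; where exactly $PYTHON_PATH should point is an assumption here (the repo root of the clone), so the actual README should confirm the intended value:

```shell
# Clone GenAIComps alongside GenAIExamples (needed to build the component images)
if [ ! -d GenAIComps ]; then
  git clone https://github.com/opea-project/GenAIComps.git || echo "git clone failed; check network access"
fi
# The Dockerfiles reference $PYTHON_PATH; pointing it at the clone is an assumption
export PYTHON_PATH=$PWD/GenAIComps
```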
2) The documentation in https://github.com/opea-project/GenAIExamples/tree/5dae7137932b0f397e2bfe18bfd2cd94da0dd495/ChatQnA/docker_compose/intel/cpu/xeon could be improved by asking the user to set a CONTAINER_ID environment variable: docker logs ${CONTAINER_ID} | grep Connected
Better still, if the LLM service were named more generically, without even mentioning the model serving framework, such as "llm_service", the command would require no environment variable at all:
docker logs llm_service | grep Connected
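The environment-variable variant can be sketched as follows; the value assigned to CONTAINER_ID is illustrative, since the actual container name depends on the deployment:

```shell
# Point CONTAINER_ID at the LLM serving container (name is deployment-specific)
export CONTAINER_ID=llm_service   # illustrative value
# Check whether the service has reported a successful connection
docker logs "${CONTAINER_ID}" 2>/dev/null | grep Connected || echo "no 'Connected' line found (is the container running?)"
```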
3) While building the Docker images a) for "retriever" and the Mega service, a build warning is reported: 1 warning found (use docker --debug to expand):
Reproduce steps
Follow the README guide.
Raw log