Generative AI Examples is a collection of GenAI examples, such as ChatQnA and Copilot, that illustrate the pipeline capabilities of the Open Platform for Enterprise AI (OPEA) project.
1) While the other services refer to BACKEND_SERVICE_ENDPOINT, etc., this one refers to the APP_* variants.
2) The proxy-related environment variables are missing.
3) Why is the dataprep dependency not mentioned, only the ChatQnA backend? I ask because the UI supports uploading documents; don't they need to be routed to the dataprep service for chunking, etc.?
Things are not working for me, hence these questions.
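To make question 3 concrete, this is the kind of dependency I would have expected the README to mention for the UI/backend, so that uploaded documents reach dataprep for chunking and indexing. This is a hypothetical fragment: the service name dataprep-redis-service is my assumption from the Xeon compose file, not something the README states.

```yaml
# Hypothetical compose fragment (service name assumed, not from the README):
chatqna-xeon-backend-server:
  depends_on:
    - dataprep-redis-service   # assumed name of the dataprep service that
                               # chunks and indexes documents uploaded via the UI
```
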
Here are the modifications to the compose.yaml file, following the README. This is my compose.yaml file:
compose.txt
Could you try again with the same file or a similar modification?
Priority
P3-Medium
OS type
Ubuntu
Hardware type
Xeon-ICX
Installation method
Deploy method
Running nodes
Single Node
What's the version?
The host VM is not a concern here; the question is only about the README: https://github.com/opea-project/GenAIExamples/blob/main/ChatQnA/docker_compose/intel/cpu/xeon/README.md
```yaml
chatqna-xeon-conversation-ui-server:
  image: opea/chatqna-conversation-ui:latest
  container_name: chatqna-xeon-conversation-ui-server
  environment:
```
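For completeness, here is the fuller variant I tried. The proxy variables and the APP_* endpoint names are my assumptions (questions 1 and 2 above), since the README does not spell them out; the host-side variable names on the right are placeholders from my environment.

```yaml
# Sketch of the conversation-UI service with the settings I assumed were
# needed; the APP_* names and the proxy passthrough are guesses, not
# values confirmed by the README.
chatqna-xeon-conversation-ui-server:
  image: opea/chatqna-conversation-ui:latest
  container_name: chatqna-xeon-conversation-ui-server
  environment:
    - no_proxy=${no_proxy}
    - http_proxy=${http_proxy}
    - https_proxy=${https_proxy}
    # Assumed APP_* variants of the endpoints the other services reference:
    - APP_BACKEND_SERVICE_ENDPOINT=${BACKEND_SERVICE_ENDPOINT}
    - APP_DATA_PREP_SERVICE_URL=${DATAPREP_SERVICE_ENDPOINT}
  depends_on:
    - chatqna-xeon-backend-server
  ipc: host
  restart: always
```
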
Description
Follow the ChatQnA README to customize the ChatQnA compose file to use the conversational UI.
Reproduce steps
Edit compose.yaml and use docker to restart (either with images from Docker Hub or built locally).
Raw log