Generative AI Examples is a collection of GenAI examples, such as ChatQnA and Copilot, that illustrate the pipeline capabilities of the Open Platform for Enterprise AI (OPEA) project.
Please reorganize the material to clearly demarcate:
1) what the ChatQnA pipeline is
2) deployment options: Docker versus Kubernetes (or another orchestration option)
3) deployment hardware options: Xeon, Gaudi, Nvidia GPU, etc.
For instance, there are instructions on environment variable setup. Are they specific to Docker deployments only, or are they also relevant to Kubernetes?
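To illustrate why the README should state the scope of its environment variable instructions, here is a minimal sketch of the distinction. The variable names below are assumed for illustration, not taken from the ChatQnA README:

```shell
# Illustrative only: variable names are assumed, not quoted from the README.
# Docker Compose deployments read these from the host shell at `docker compose up`.
export host_ip=$(hostname -I 2>/dev/null | awk '{print $1}')
export HUGGINGFACEHUB_API_TOKEN=your_hf_token

# Kubernetes pods never see these host-shell exports; the equivalent values
# must instead be supplied through a ConfigMap, a Secret, or a Helm values
# file, which is exactly the kind of scoping note the README should add.
echo "host_ip=${host_ip}"
```

If the export steps apply only to the Docker path, saying so explicitly (and pointing Kubernetes users to the ConfigMap/Helm equivalent) would remove the ambiguity.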
For Kubernetes with and without GMC, it is confusing when two links point to different places but both read the same, as "Kubernetes install". Change the link text so each destination is distinguishable.
The README in question: https://github.com/opea-project/GenAIExamples/edit/main/ChatQnA/README.md