arun-gupta opened this issue 2 weeks ago
@arun-gupta The architecture design is to open up the gateway port only.
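If only the gateway port needs to be reachable from outside, the Security Group rule can be scoped to that single port. A minimal sketch using the AWS CLI, assuming the ChatQnA gateway listens on port 8888 (check your compose file) and that `sg-0123456789abcdef0` and the source CIDR are placeholders you must replace:

```shell
# Allow inbound TCP only on the gateway port, only from your own IP.
# sg-0123456789abcdef0 and 203.0.113.7/32 are illustrative placeholders;
# 8888 is assumed to be the ChatQnA mega-service gateway port.
aws ec2 authorize-security-group-ingress \
  --group-id sg-0123456789abcdef0 \
  --protocol tcp \
  --port 8888 \
  --cidr 203.0.113.7/32
```

Restricting the source CIDR to a single address (rather than 0.0.0.0/0) keeps the gateway from being exposed to the whole internet even while testing.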
Let me try to get an AWS platform first. Then we can look into the network settings on AWS, and we can try to verify the port settings for the GenAIExamples.
I've been playing with the setup on AWS. I logged into the EC2 instance and could run a couple of tests locally without opening any ports.
It seems like the ports only need to be opened if you run curl from your local machine. The instructions already mention that, but it is easy to miss. Can we remove the instructions to open up the ports? Otherwise, someone might accidentally do that in production.
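The distinction above can be sketched with two curl invocations. This assumes the ChatQnA gateway endpoint is `/v1/chatqna` on port 8888, as described in the GenAIExamples docs; the payload is illustrative:

```shell
# From an SSH session ON the EC2 instance: traffic stays on localhost,
# so no Security Group inbound rule is needed for this to work.
curl http://localhost:8888/v1/chatqna \
  -H "Content-Type: application/json" \
  -d '{"messages": "What is OPEA?"}'

# From your LOCAL machine: this only works if the Security Group allows
# inbound TCP on the gateway port. <ec2-public-ip> is a placeholder.
curl http://<ec2-public-ip>:8888/v1/chatqna \
  -H "Content-Type: application/json" \
  -d '{"messages": "What is OPEA?"}'
```

Only the second command requires any ports to be opened, which is why the open-port step can safely be skipped for on-instance testing.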
Here are my evolving instructions: https://gist.github.com/arun-gupta/7e9f080feff664fbab878b26d13d83d7
@arun-gupta
Do you have an AWS platform you can share with us so I can investigate this issue? We don't have an AWS platform.
@xiguiw sure, send me an internal Teams message and let's set up offline time to work through this.
Priority
Undecided
OS type
Ubuntu
Hardware type
Xeon-SPR
Installation method
Deploy method
Running nodes
Single Node
What's the version?
0.9
Description
I'm trying to follow the instructions at https://github.com/opea-project/GenAIExamples/tree/main/ChatQnA/docker/xeon.
Why do we need to open up all these ports in the EC2 Security Group? These services should only be talking to each other within the VPC.
Reproduce steps
N/A — I'm asking about the need to open up the ports before I start implementing this.
Raw log
No response