opea-project / GenAIExamples

Generative AI Examples is a collection of GenAI examples, such as ChatQnA and Copilot, that illustrate the pipeline capabilities of the Open Platform for Enterprise AI (OPEA) project.
https://opea.dev
Apache License 2.0

[Discussion] Do we need to open up all the security ports for AWS? #698

Open arun-gupta opened 2 weeks ago

arun-gupta commented 2 weeks ago

Priority

Undecided

OS type

Ubuntu

Hardware type

Xeon-SPR

Installation method

Deploy method

Running nodes

Single Node

What's the version?

0.9

Description

I'm trying to follow the instructions at https://github.com/opea-project/GenAIExamples/tree/main/ChatQnA/docker/xeon.

Why do we need to open up all these ports in the EC2 Security Group? These services should be talking to each other within the VPC only.

Reproduce steps

Curious about the need to open up the ports before I start implementing this.

Raw log

No response

xiguiw commented 2 weeks ago

@arun-gupta The architecture design is to open up the gateway port only.

Let me try to get an AWS platform first. Then we can learn the network settings on AWS and try to verify the port settings for the GenAIExamples.
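For illustration only, here is a minimal sketch of what "gateway port only" could look like with boto3. The port number (8888), security group ID, and CIDR range are placeholders/assumptions rather than values from the ChatQnA instructions; adjust them to your deployment.

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Allow inbound traffic only on the gateway port (8888 is an assumption for
# the ChatQnA mega-service), and only from a trusted range instead of 0.0.0.0/0.
# All other services stay reachable only inside the instance/VPC.
ec2.authorize_security_group_ingress(
    GroupId="sg-0123456789abcdef0",  # placeholder security group ID
    IpPermissions=[
        {
            "IpProtocol": "tcp",
            "FromPort": 8888,
            "ToPort": 8888,
            "IpRanges": [
                {"CidrIp": "203.0.113.0/24", "Description": "trusted clients only"}
            ],
        }
    ],
)
```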

arun-gupta commented 2 weeks ago

I've been playing with the setup on AWS. I logged into the EC2 instance and could run a couple of tests locally without opening any ports.

It seems like the ports only need to be opened up if you curl from your local machine. The instructions already mention that, but it is easy to miss. Can we remove the instructions to open up the ports? Otherwise someone might accidentally do that in production.

Here are my evolving instructions: https://gist.github.com/arun-gupta/7e9f080feff664fbab878b26d13d83d7
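For reference, this is roughly what such a local test looks like when run on the EC2 instance itself, so no security group ingress rule is needed. The port and endpoint path below are assumptions based on the ChatQnA compose setup and may differ in your deployment.

```python
import requests

# Smoke test against the ChatQnA gateway from inside the instance: traffic
# stays on localhost / the docker network, so nothing is exposed publicly.
# Port 8888 and the /v1/chatqna path are assumptions; check your compose file.
resp = requests.post(
    "http://localhost:8888/v1/chatqna",
    json={"messages": "What is OPEA?"},
    timeout=120,
)
print(resp.status_code)
print(resp.text[:500])
```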

xiguiw commented 4 days ago

@arun-gupta

Do you have an AWS platform you could share with us so I can investigate this issue? We have no AWS platform.

arun-gupta commented 4 days ago

@xiguiw Sure, send me an internal Teams message and let's set up time offline to work through this.