thomasrd1 opened 3 years ago
You can use other buckets as well; just make sure the EC2 role has permission to read the .env from the S3 bucket.
I don't want to hard code the S3 bucket / URL to .env
I guess this can be solved with AWS Parameter Store. You can define the parameters there and have a custom script that uses the instance's EC2 role to pull the secrets from Parameter Store into a file to build your .env. This seems promising and might become a great feature for this project. 🤔
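A minimal sketch of that custom-script idea, assuming a parameter path like /myapp/prod/ and an instance profile that allows ssm:GetParametersByPath; all names here are illustrative, not from this project:

```python
def params_to_env_lines(params):
    """Turn SSM parameter dicts into KEY=value .env lines,
    using the last path segment as the variable name."""
    lines = []
    for p in params:
        name = p["Name"].rsplit("/", 1)[-1].upper()
        lines.append(f"{name}={p['Value']}")
    return lines

def write_env(path="/myapp/prod/", out_file=".env"):
    import boto3  # deferred so the pure helper above works without the SDK
    ssm = boto3.client("ssm")
    params = []
    for page in ssm.get_paginator("get_parameters_by_path").paginate(
        Path=path, WithDecryption=True
    ):
        params.extend(page["Parameters"])
    with open(out_file, "w") as f:
        f.write("\n".join(params_to_env_lines(params)) + "\n")
```

Run at deploy time (e.g. from a platform hook), the EC2 role's credentials are picked up automatically, so nothing sensitive is hardcoded.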
I remember opening an issue a while ago on the AWS EB Roadmap repo that would fix this out-of-the-box: https://github.com/aws/elastic-beanstalk-roadmap/issues/57
Infrastructure as code is great, except when you have to write custom code. What you have here in this project is amazing.
I try not to hard code any bucket/keys anywhere.
There are two different options: pull the config variables from AWS Secrets Manager, or from the environment properties. The latter means you need to have the Elastic Beanstalk environment set up (initially blank) before uploading the code.
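The two options could be combined into one lookup: prefer an environment property if it is set, otherwise fall back to Secrets Manager. A hedged sketch; the secret name and its JSON layout are assumptions:

```python
import os

def get_config(name, secret_id="myapp/config"):
    """Prefer an environment property set on the Beanstalk environment;
    fall back to a JSON secret in Secrets Manager.
    secret_id is an assumption for illustration."""
    val = os.environ.get(name)
    if val is not None:
        return val
    import json
    import boto3  # deferred so the env-var path needs no AWS SDK
    sm = boto3.client("secretsmanager")
    secret = json.loads(sm.get_secret_value(SecretId=secret_id)["SecretString"])
    return secret.get(name)
```

The env-var path works unchanged across regions, which matters for the multi-region question below in the thread.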
I tried and failed to find an old issue where one of the users discovered that environment variables set in the AWS Console get copied automatically by AWS into a file somewhere under /etc, which you could just use as-is.
After a bit of research I found remind101/ssm-env, which seems like a better way to approach the issue, but it still doesn't solve multi-env without some hardcoding.
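For reference, the core idea of ssm-env (as I understand its README) is that any environment variable whose value looks like ssm://NAME gets replaced by the corresponding parameter's value before the process starts. A small Python sketch of that expansion logic, with the SSM lookup injected so nothing here actually touches AWS:

```python
SSM_PREFIX = "ssm://"

def expand_ssm_refs(environ, fetch):
    """Replace values of the form ssm://NAME with the parameter's value.
    `fetch` stands in for an SSM GetParameter call (e.g. via boto3);
    it is injected so this sketch stays testable without AWS."""
    out = {}
    for key, value in environ.items():
        if value.startswith(SSM_PREFIX):
            out[key] = fetch(value[len(SSM_PREFIX):])
        else:
            out[key] = value
    return out
```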
What is the best way to insert the .env in a multi-region environment?
The deployment would run... ap-southeast-2 (dev > prod) > us-east-1 > eu-central-1
The example works great for a single region (dev > prod), but how can I automatically get the S3 bucket name and bucket path for the other regions?
Would it be easier to add some environment variables and then query them?
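One way that might answer this: derive the bucket name from the current region by convention instead of hardcoding it per region. A hedged sketch; the app-env-region naming scheme is purely an assumption for illustration:

```python
import os

def current_region():
    """Beanstalk/EC2 usually expose the region via environment variables;
    a real deployment could fall back to the instance metadata service."""
    return os.environ.get("AWS_REGION") or os.environ.get("AWS_DEFAULT_REGION")

def bucket_for(region, app="myapp", env="prod"):
    """Build a region-scoped bucket name by convention (assumed scheme),
    so the same code works in ap-southeast-2, us-east-1 and eu-central-1."""
    return f"{app}-{env}-{region}"
```

With a convention like this, only the app name and environment need to be set as environment variables; the region-specific parts fall out automatically.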