laserlemon / figaro

Simple Rails app configuration
MIT License

Options for Figaro on Elastic Beanstalk #273

Open tomgallagher opened 5 years ago

tomgallagher commented 5 years ago

Hello

Thanks for Figaro.

I'm thinking about moving one of my apps from Heroku to Elastic Beanstalk on AWS.

Figaro seems to be Heroku-focused. As far as I can see, I have a few options to make Figaro work on Elastic Beanstalk.

1) Copy the environment variables into my Elastic Beanstalk environment one-by-one.

I'd like to avoid doing this if possible because I have a lot of them and it is bound to be error-prone.

2) Just remove application.yml from the gitignore file.

I haven't tested this yet, but it seems like the path of least resistance. It obviously undercuts the whole purpose of using Figaro in the first place, though.

3) Generate a remote configuration file

You mention this in the docs, but I can't see any examples and I'm not sure what it means. Could you elaborate a bit on how Figaro could retrieve a file from an S3 bucket, for example?
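(For option 1, the copying could at least be scripted rather than done by hand. A minimal sketch, assuming a flat application.yml with no per-environment nesting and the EB CLI installed; the method name and sample values are hypothetical:)

```ruby
# Sketch: turn Figaro's application.yml into a single `eb setenv` call,
# so the variables don't have to be copied into Elastic Beanstalk one by one.
require "yaml"
require "shellwords"

def eb_setenv_command(yaml_text)
  vars = YAML.safe_load(yaml_text)
  # Shell-escape each value so spaces and special characters survive intact.
  pairs = vars.map { |key, value| "#{key}=#{Shellwords.escape(value.to_s)}" }
  "eb setenv #{pairs.join(' ')}"
end

# Example with a hypothetical application.yml:
sample = <<~YAML
  SECRET_TOKEN: abc123
  S3_BUCKET: my-bucket
YAML

puts eb_setenv_command(sample)
# prints: eb setenv SECRET_TOKEN=abc123 S3_BUCKET=my-bucket
```

(Run the printed command from the app directory, or pipe it straight to the shell. If your application.yml nests keys per Rails environment, you'd need to select the right sub-hash first.)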

Thanks

Tom

dgarwood commented 4 years ago

@tomgallagher I certainly wouldn't do 2. I just dealt with a project where all the keys were in credentials.yml.enc (yes, including dev and test) and a dev's laptop was stolen. Even encrypted, the keys were still committed to the repo and had to be rotated.

I'm currently working on an S3 approach to getting config files onto a system, and I'll let you know how that goes.

dgarwood commented 4 years ago

@tomgallagher (and those who might find this later) follow up from my previous comment.

We used the `aws-sdk-s3` gem and keep copies of our config files in an S3 bucket. During deploy, our Capistrano task pulls the file from the bucket down into the shared folder. The main thing to leverage is that SSHKit's `upload!` accepts an IO stream, which is exactly what the S3 object's body returns. This is one method that works, not the only way to accomplish it.

# Capistrano task: pull the config file down from S3 into the shared
# folder on each app server, streaming the S3 object body through
# SSHKit's upload!.
require "aws-sdk-s3"

namespace :deploy do
  task :fetch_app_config do
    on roles(:app) do
      s3 = Aws::S3::Resource.new(
        region: fetch(:aws_region),
        credentials: Aws::Credentials.new(
          fetch(:aws_access_key),
          fetch(:aws_secret_key)
        )
      )
      # Use fetch(...) here too; bare symbols would be taken literally
      # as the bucket name and key.
      obj = s3.bucket(fetch(:aws_s3_bucket)).object(fetch(:s3_file_path))
      # obj.get.body is a StringIO, which upload! accepts directly.
      upload! obj.get.body, fetch(:server_file_path)
    end
  end
end
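(One detail worth noting for anyone copying this: if the file lands in Capistrano's shared folder, linking it into each release makes it visible to Figaro at the usual path. A sketch assuming standard Capistrano conventions and that the task above uploaded to shared/config/application.yml:)

```ruby
# In deploy.rb: symlink the shared copy into every release so Figaro
# finds it at config/application.yml as usual.
append :linked_files, "config/application.yml"
```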