cherweg / logstash-input-s3-sns-sqs

Logstash input plugin that downloads files from an S3 bucket by ObjectKey, as referenced in SNS/SQS messages

Cross-account - assume_role #14

Closed dhammond22222 closed 5 years ago

dhammond22222 commented 5 years ago

I would like to highlight an issue and see what it would take to accomplish this goal. Here's the scenario (likely very common across enterprises that use AWS):

AWS AccountID A contains a set of S3 buckets and SQS queues set up in such a way that would allow this plugin to consume the referenced objects in the SQS message from the S3 bucket. (Event notification from S3 -> SQS queue)

AWS AccountID B contains a local IAM user. This IAM user needs to be able to access AccountID A in order to consume the SQS messages and read the S3 bucket.

For this to happen, the ideal setup is the following:

AWS AccountID A sets up a trust relationship with AWS AccountID B that allows the user from AWS AccountID B to assume role into AWS AccountID A with appropriate permissions to read the SQS/S3.

Logstash is set up in this case to use the credentials for Account B, but then to assume a role into Account A to read the queue/bucket.
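As a concrete sketch, the trust policy attached to the role in Account A would look roughly like this (standard IAM trust-policy syntax; the account ID is a placeholder):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws:iam::<ACCOUNT_B_ID>:root" },
      "Action": "sts:AssumeRole"
    }
  ]
}
```

The role itself would additionally need permissions to read the SQS queue and S3 bucket in Account A.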

Can you clarify whether this functionality exists in the plugin's current form?

christianherweg0807 commented 5 years ago

Using roles for SQS is not possible at this time. See this line:

https://github.com/cherweg/logstash-input-s3-sns-sqs/blob/41a06c35d922b7d2bc56cdd563af70fead9e54d8/lib/logstash/inputs/s3snssqs.rb#L159

You could use an account ID for the connection, but for that you'll need the applicable IAM access rights. A possible solution would be to introduce a new config option analogous to s3_role_arn. See:

https://github.com/cherweg/logstash-input-s3-sns-sqs/blob/41a06c35d922b7d2bc56cdd563af70fead9e54d8/lib/logstash/inputs/s3snssqs.rb#L402
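To illustrate the suggestion, such a hypothetical option (not implemented at the time of writing; the name sqs_role_arn is made up here, mirroring the existing s3_role_arn) might look like this in a pipeline config:

```
input {
  s3snssqs {
    queue        => "<QUEUE_NAME>"
    region       => "us-west-2"
    # existing option: role assumed when fetching objects from S3
    s3_role_arn  => "arn:aws:iam::<ACCOUNT_A_ID>:role/<S3_READER_ROLE>"
    # hypothetical new option: role assumed before polling SQS
    sqs_role_arn => "arn:aws:iam::<ACCOUNT_A_ID>:role/<SQS_READER_ROLE>"
  }
}
```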

Much better for debugging and monitoring purposes is to publish the notifications to an SNS topic in Account A. Account B can then subscribe an SQS queue to that SNS topic.

This setup is easy to recover if you have to redeploy your service. See the picture below.

[image: architecture diagram of the cross-account SNS-to-SQS setup]
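For this fan-out pattern, the policy on the SQS queue in Account B has to allow the SNS topic in Account A to deliver messages. A typical sketch (all ARNs are placeholders) looks like this:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "Service": "sns.amazonaws.com" },
      "Action": "sqs:SendMessage",
      "Resource": "arn:aws:sqs:us-west-2:<ACCOUNT_B_ID>:<QUEUE_NAME>",
      "Condition": {
        "ArnEquals": {
          "aws:SourceArn": "arn:aws:sns:us-west-2:<ACCOUNT_A_ID>:<TOPIC_NAME>"
        }
      }
    }
  ]
}
```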
dhammond22222 commented 5 years ago

Thanks. I'm now hosting the SQS queue locally in my AWS account while the S3 bucket and SNS topic are in the remote account, and I'm encountering the error below:

```
[2019-05-23T23:45:01,614][ERROR][logstash.pipeline ] Pipeline aborted due to error {:pipeline_id=>"main", :exception=>#, :backtrace=>[
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/aws-sdk-core-2.11.256/lib/aws-sdk-core/plugins/request_signer.rb:104:in `require_credentials'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/aws-sdk-core-2.11.256/lib/aws-sdk-core/plugins/request_signer.rb:94:in `sign_authenticated_requests'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/aws-sdk-core-2.11.256/lib/aws-sdk-core/plugins/request_signer.rb:87:in `call'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/aws-sdk-core-2.11.256/lib/aws-sdk-core/plugins/helpful_socket_errors.rb:10:in `call'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/aws-sdk-core-2.11.256/lib/aws-sdk-core/plugins/retry_errors.rb:108:in `call'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/aws-sdk-core-2.11.256/lib/aws-sdk-core/query/handler.rb:27:in `call'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/aws-sdk-core-2.11.256/lib/aws-sdk-core/plugins/user_agent.rb:12:in `call'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/aws-sdk-core-2.11.256/lib/aws-sdk-core/plugins/endpoint_pattern.rb:27:in `call'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/aws-sdk-core-2.11.256/lib/aws-sdk-core/plugins/endpoint_discovery.rb:67:in `call'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/aws-sdk-core-2.11.256/lib/seahorse/client/plugins/endpoint.rb:41:in `call'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/aws-sdk-core-2.11.256/lib/aws-sdk-core/plugins/param_validator.rb:21:in `call'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/aws-sdk-core-2.11.256/lib/seahorse/client/plugins/raise_response_errors.rb:14:in `call'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/aws-sdk-core-2.11.256/lib/aws-sdk-core/plugins/jsonvalue_converter.rb:20:in `call'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/aws-sdk-core-2.11.256/lib/aws-sdk-core/plugins/idempotency_token.rb:18:in `call'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/aws-sdk-core-2.11.256/lib/aws-sdk-core/plugins/param_converter.rb:20:in `call'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/aws-sdk-core-2.11.256/lib/seahorse/client/plugins/response_target.rb:21:in `call'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/aws-sdk-core-2.11.256/lib/seahorse/client/request.rb:70:in `send_request'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/aws-sdk-core-2.11.256/lib/seahorse/client/base.rb:207:in `block in define_operation_methods'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/aws-sdk-core-2.11.256/lib/aws-sdk-core/assume_role_credentials.rb:49:in `refresh'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/aws-sdk-core-2.11.256/lib/aws-sdk-core/refreshing_credentials.rb:20:in `initialize'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/aws-sdk-core-2.11.256/lib/aws-sdk-core/assume_role_credentials.rb:40:in `initialize'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-s3-sns-sqs-1.6.1/lib/logstash/inputs/s3snssqs.rb:416:in `s3_assume_role'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-s3-sns-sqs-1.6.1/lib/logstash/inputs/s3snssqs.rb:402:in `get_s3client'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-s3-sns-sqs-1.6.1/lib/logstash/inputs/s3snssqs.rb:161:in `setup_queue'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-s3-sns-sqs-1.6.1/lib/logstash/inputs/s3snssqs.rb:152:in `register'",
"/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:259:in `register_plugin'",
"/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:270:in `block in register_plugins'",
"org/jruby/RubyArray.java:1792:in `each'",
"/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:270:in `register_plugins'",
"/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:413:in `start_inputs'",
"/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:311:in `start_workers'",
"/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:217:in `run'",
"/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:176:in `block in start'"], :thread=>"#"}
```

Here's my input configuration (with the values for the resources and access keys/secrets redacted):

```
input {
  s3snssqs {
    access_key_id              => ""
    secret_access_key          => ""
    role_arn                   => "arn:aws:iam:::role/"
    s3_role_arn                => "arn:aws:iam:::role/"
    queue                      => ""
    queue_owner_aws_account_id => ""
    s3_key_prefix              => "logs"
    region                     => "us-west-2"
    sqs_explicit_delete        => true
    from_sns                   => true
  }
}
```

dhammond22222 commented 5 years ago

I was able to resolve the issue by removing s3_role_arn and leaving only role_arn. The plugin can now connect and retrieve the objects from the remote S3 bucket by referencing messages in the local SQS queue.
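For anyone landing here with the same error, a minimal sketch of the resolved configuration (placeholders throughout; note that s3_role_arn is absent and only role_arn remains):

```
input {
  s3snssqs {
    access_key_id              => "<ACCOUNT_B_ACCESS_KEY>"
    secret_access_key          => "<ACCOUNT_B_SECRET_KEY>"
    role_arn                   => "arn:aws:iam::<ACCOUNT_A_ID>:role/<CROSS_ACCOUNT_ROLE>"
    queue                      => "<LOCAL_QUEUE_NAME>"
    queue_owner_aws_account_id => "<ACCOUNT_B_ID>"
    s3_key_prefix              => "logs"
    region                     => "us-west-2"
    sqs_explicit_delete        => true
    from_sns                   => true
  }
}
```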

Thanks for your guidance!

christianherweg0807 commented 5 years ago

Perfect. Have fun.