cloudposse / terraform-aws-ssm-patch-manager

Terraform module to provision AWS SSM Patch Manager maintenance window tasks, targets, patch baseline, patch groups, and an S3 bucket for storing patch task logs.
https://cloudposse.com/accelerate
Apache License 2.0

Module won't deploy, S3 error #26

Closed — r1ddl3 closed this 4 months ago

r1ddl3 commented 1 year ago

Describe the Bug

I am trying to deploy the module. `terraform plan` succeeds, but `terraform apply` fails with the following error:

```
│ Error: error creating S3 bucket ACL for terraform-20230520132723867900000001: AccessControlListNotSupported: The bucket does not allow ACLs
│ 	status code: 400, request id: 18G8BJ3XHY7KBXPD, host id: rqJmcHZRRAG4Dfqi8j8TER68yMMrJqJX+zMNf0HAjbZrtnsXDeC3rSZZsVg6Oi81xM4pmGNlPtlY5DIdAkfmcg==
│
│   with module.ssm_patch_manager.module.ssm_patch_log_s3_bucket[0].aws_s3_bucket_acl.default[0],
│   on .terraform/modules/ssm_patch_manager.ssm_patch_log_s3_bucket/main.tf line 148, in resource "aws_s3_bucket_acl" "default":
│  148: resource "aws_s3_bucket_acl" "default" {
│
╵
╷
│ Error: Error putting S3 policy: MalformedPolicy: Policy has invalid resource
│ 	status code: 400, request id: PGSC0P362TF907VA, host id: LBsvglVnQk4d4LMQpGWS3Q3E/iH4RdR9qLYJmnpzBcsb1BLEiiBHiN9F9zrjXk9Og5ciDabgsmE=
│
│   with module.ssm_patch_manager.module.ssm_patch_log_s3_bucket[0].aws_s3_bucket_policy.default[0],
│   on .terraform/modules/ssm_patch_manager.ssm_patch_log_s3_bucket/main.tf line 446, in resource "aws_s3_bucket_policy" "default":
│  446: resource "aws_s3_bucket_policy" "default" {
│
╵
```

I suspect this is caused by the S3 security changes AWS rolled out in April 2023, which disable ACLs on newly created buckets by default: https://aws.amazon.com/blogs/aws/heads-up-amazon-s3-security-changes-are-coming-in-april-of-2023/
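For context, since that change new buckets default to `ObjectOwnership = BucketOwnerEnforced`, which rejects `PutBucketAcl` calls with exactly the `AccessControlListNotSupported` error above. Outside the module, the standard workaround in plain Terraform is to set explicit ownership controls before applying an ACL. This is a standalone sketch (bucket and resource names are hypothetical), not the module's own code:

```hcl
# Sketch: re-enabling ACLs on a post-April-2023 bucket.
# New buckets default to BucketOwnerEnforced, which rejects ACL puts;
# ownership must be relaxed first, and the ACL must depend on it.
resource "aws_s3_bucket" "logs" {
  bucket_prefix = "ssm-patch-logs-"
}

resource "aws_s3_bucket_ownership_controls" "logs" {
  bucket = aws_s3_bucket.logs.id
  rule {
    object_ownership = "BucketOwnerPreferred"
  }
}

resource "aws_s3_bucket_acl" "logs" {
  bucket     = aws_s3_bucket.logs.id
  acl        = "private"
  depends_on = [aws_s3_bucket_ownership_controls.logs]
}
```

The consumed s3-bucket module would need an equivalent ordering internally for its `aws_s3_bucket_acl` resource to succeed.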

Expected Behavior

The module should apply cleanly, matching what the plan shows.

Steps to Reproduce

Just run the module.
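For reference, a minimal invocation looks like the following. The registry source follows from the repo name; the version placeholder and inputs are illustrative, so consult the module's README for the required variables:

```hcl
module "ssm_patch_manager" {
  source  = "cloudposse/ssm-patch-manager/aws"
  version = "x.y.z" # pin to a recent release

  # plus the module's required context inputs (name, etc.)
  # per its README; no special S3 configuration is needed
  # to reproduce the error.
}
```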

Screenshots

No response

Environment

No response

Additional Context

No response

Gowiem commented 4 months ago

@r1ddl3 this may have been an issue when the S3 bucket policy changes first came out, but it no longer reproduces, likely because the consumed s3-bucket module has since been upgraded and patched. Can you please check whether you're still experiencing this issue, and reopen if so? Thanks!