ministryofjustice / modernisation-platform

A place for the core work of the Modernisation Platform • This repository is defined and managed in Terraform
https://user-guide.modernisation-platform.service.justice.gov.uk
MIT License

🔒️ Increase security of sensitive S3 objects #5346

Closed · dms1981 closed this 1 month ago

dms1981 commented 11 months ago

User Story

As a Modernisation Platform customer,
I expect to have secure access to sensitive objects in S3,
So that I am protected against unexpected disclosures.

Value / Purpose

While we can control access to objects in S3, such as Terraform state files, through bucket policies, we risk accidental disclosure if bucket policies and IAM policies are not suitably strict.

We should consider reviewing our bucket policies with the assistance of AWS Support, and implementing security using a condition block similar to the one below:

condition {
  test     = "StringEquals"
  variable = "s3:ExistingObjectTag/PermittedAccount"
  values   = ["&{aws:PrincipalAccount}"]
}
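For context, a condition like this would typically sit inside an `aws_iam_policy_document` statement attached as a bucket policy. A minimal sketch, written as an explicit deny (the inverse of the allow-style condition above); the bucket name and data source name are illustrative, not taken from our code:

```hcl
# Illustrative only: deny s3:GetObject unless the object's PermittedAccount
# tag matches the AWS account of the principal making the request.
data "aws_iam_policy_document" "state_access" {
  statement {
    sid       = "DenyCrossAccountObjectRead"
    effect    = "Deny"
    actions   = ["s3:GetObject"]
    resources = ["arn:aws:s3:::example-state-bucket/*"] # hypothetical bucket

    principals {
      type        = "AWS"
      identifiers = ["*"]
    }

    condition {
      test     = "StringNotEquals"
      variable = "s3:ExistingObjectTag/PermittedAccount"
      # "&{...}" is Terraform's escape so the IAM policy variable
      # ${aws:PrincipalAccount} reaches AWS unexpanded.
      values = ["&{aws:PrincipalAccount}"]
    }
  }
}
```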

Useful Contacts

No response

Additional Information

https://github.com/ministryofjustice/modernisation-platform-environments/commit/bf53e19e71977f24a2e622e5f50230fd44fe4530 << commit showing the implementation of tag-secured objects

https://github.com/ministryofjustice/modernisation-platform/commit/fb0cfaf351119590326eb1a163cdb20d6b66e27a << commit showing an example of how to create a local value that contains a map of maps with key/value pairs equal to account-alias/account-id

Proposal / Unknowns

Definition of Done

davidkelliott commented 11 months ago

https://aws.amazon.com/about-aws/whats-new/2023/11/organization-wide-iam-condition-keys-restrict-aws-service-to-service-requests/

ep-93 commented 5 months ago

Thursday I was milk monitor, and today I have been looking at this a little, but decided to learn a bit more about S3's new features for my ten percent time. Progress on this is, well, nothing apart from me learning a bit. I might commit something by end of play today, but I am off next week.

ep-93 commented 5 months ago

A few ideas that might already be implemented in the module, but I shall find out:

  1. Enable Server-Side Encryption: ensure that all data stored in the S3 bucket is encrypted. You can enable server-side encryption using AWS managed keys (SSE-S3) or AWS Key Management Service (SSE-KMS).
  2. Enforce HTTPS for Access: ensure that all communications with the S3 bucket are encrypted by enforcing the use of HTTPS.
  3. Implement Bucket Policies for Least Privilege Access: define bucket policies that limit access based on the principle of least privilege, ensuring only authorized users or services can interact with the bucket.
  4. Enable MFA Delete: add an additional layer of security for deleting objects and versions.
  5. Use AWS Config Rules for Continuous Monitoring: set up AWS Config rules to continuously monitor your S3 bucket for compliance with security best practices.
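Points 1 and 2 above can be sketched in Terraform roughly as follows; the resource names and KMS key are illustrative, and the module may already implement some of this:

```hcl
# Illustrative: SSE-KMS default encryption for the bucket.
resource "aws_s3_bucket_server_side_encryption_configuration" "this" {
  bucket = aws_s3_bucket.example.id # hypothetical bucket resource

  rule {
    apply_server_side_encryption_by_default {
      sse_algorithm     = "aws:kms"
      kms_master_key_id = aws_kms_key.example.arn # hypothetical key
    }
  }
}

# Illustrative: deny all S3 actions over plain HTTP.
data "aws_iam_policy_document" "https_only" {
  statement {
    sid       = "DenyInsecureTransport"
    effect    = "Deny"
    actions   = ["s3:*"]
    resources = [
      aws_s3_bucket.example.arn,
      "${aws_s3_bucket.example.arn}/*",
    ]

    principals {
      type        = "AWS"
      identifiers = ["*"]
    }

    condition {
      test     = "Bool"
      variable = "aws:SecureTransport"
      values   = ["false"]
    }
  }
}
```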
ep-93 commented 4 months ago

With the changes below, you would now call the module like this:

module "s3_bucket" {
  source     = "github.com/ministryofjustice/modernisation-platform-terraform-s3-bucket"
  bucket     = "my-bucket"
  conditions = {
    IpAddress = {
      "aws:SourceIp" = "192.0.2.0/24"
    }
  }
}

Changes - https://github.com/ministryofjustice/modernisation-platform-terraform-s3-bucket/commit/7bd7806d0b613c142d7ad5569dab629786bb0f40
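Inside the module, a `conditions` variable shaped like the call above could be expanded into `condition` blocks with a `dynamic` block. This is a sketch of the general technique, not the exact code in the linked commit; the variable, local, and resource names are assumptions:

```hcl
# Assumed input shape: condition test name => map of condition variable => value,
# e.g. { IpAddress = { "aws:SourceIp" = "192.0.2.0/24" } }.
variable "conditions" {
  type    = map(map(string))
  default = {}
}

locals {
  # Flatten the map of maps into one object per condition block.
  condition_list = flatten([
    for test, vars in var.conditions : [
      for name, value in vars : {
        test     = test
        variable = name
        value    = value
      }
    ]
  ])
}

data "aws_iam_policy_document" "bucket_policy" {
  statement {
    effect    = "Allow"
    actions   = ["s3:GetObject"]
    resources = ["${aws_s3_bucket.bucket.arn}/*"] # assumed bucket resource name
    # principals omitted for brevity in this sketch

    dynamic "condition" {
      for_each = local.condition_list
      content {
        test     = condition.value.test
        variable = condition.value.variable
        values   = [condition.value.value]
      }
    }
  }
}
```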

I've been working on the S3 module changes for replication, and have parked this until that change is released, as it will be a breaking change.

dms1981 commented 4 months ago

Placed in blocked until @ep-93 returns from leave

dms1981 commented 3 months ago

I've done a little more reading on this, and the approach is still valid. Securing an object with this condition will still work, even though we've made some changes since this issue was written. What this solves is preventing a user from moving laterally across roles.

ep-93 commented 1 month ago

So, I can apply the conditions dynamically; however, the exact request errors out with:

User: arn:aws:sts::----------assumed-role/AWSReservedSSO_AdministratorAccess_a7491ea25c15715b/ep-93@digital.justice.gov.uk is not authorized to perform: s3:PutBucketPolicy on resource: "arn:aws:s3:::s3-bucket20240903132505533800000001" because public policies are blocked by the BlockPublicPolicy block public access setting.
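The rejection comes from the bucket's Block Public Access configuration rather than the caller's IAM permissions: when `block_public_policy` is enabled, S3 refuses any `PutBucketPolicy` call whose policy it classifies as public (for example, a statement with a `"*"` principal and no restrictive condition it recognises). In Terraform that setting is typically declared like this; the resource and bucket names are illustrative:

```hcl
# Illustrative: the Block Public Access settings that reject "public" policies.
resource "aws_s3_bucket_public_access_block" "example" {
  bucket = aws_s3_bucket.example.id # hypothetical bucket

  block_public_acls       = true
  block_public_policy     = true # this is what rejects the PutBucketPolicy call
  ignore_public_acls      = true
  restrict_public_buckets = true
}
```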

Messaging the person who raised the ticket for advice.

ep-93 commented 1 month ago

Case ID 172544793000638 raised with AWS

ep-93 commented 1 month ago

So I have talked to AWS and David S about this

The PR below starts it, and it would work; however, there is a step needed before this, and given the complexity I am not sure it's worth it.

https://github.com/ministryofjustice/modernisation-platform/pull/7860/files

Plan - https://github.com/ministryofjustice/modernisation-platform/actions/runs/10718606447/job/29720868694?pr=7860

So, firstly: what this was meant to protect against is already protected against; it cannot be done anymore.

This was to add another layer. That layer, however, would require us to change the Terraform state bucket and tag each state file with a tag called PermittedAccount, populated with the account number of the corresponding account. E.g. the PPUD development state file would need the PPUD development account number as a tag.

Whilst this has been done with new files and new buckets, editing the current one to do this is a bit scary. I'm not saying it can't be done, but I'm questioning the reason for it now that we have already closed off the original security issue.

Notes

join(",", nonsensitive(local.environment_accounts["ppud"]))
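For the expression above to produce a usable tag value, `local.environment_accounts` would need to map an application name to its account IDs. One possible shape consistent with the `join` call, with placeholder account IDs (the real local, per the commit linked earlier, is built from the account-alias/account-id map):

```hcl
# Illustrative shape only: application name => list of account IDs.
locals {
  environment_accounts = {
    ppud = ["111111111111", "222222222222"] # placeholder account IDs
  }
}

# join(",", nonsensitive(local.environment_accounts["ppud"])) would then
# yield a single comma-separated string of the account IDs, suitable for
# a PermittedAccount tag value, with the sensitive marking stripped.
```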

ep-93 commented 1 month ago

Raising an ADR for this.