Closed dms1981 closed 1 month ago
Thursday I was milk monitor, and today I have been looking at this slightly, but decided to learn a bit more about S3's new features for my ten percent time. Progress on this is, well, nothing apart from me learning a bit. I might commit something by end of play today, but I am off next week.
A few ideas that might already be implemented in the module, but I shall find out.
With the changes below, you would now call the module like this:
```hcl
module "s3_bucket" {
  source = "github.com/ministryofjustice/modernisation-platform-terraform-s3-bucket"
  bucket = "my-bucket"

  conditions = {
    IpAddress = {
      "aws:SourceIp" = "192.0.2.0/24"
    }
  }
}
```
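Inside the module, a `conditions` map like the one above could be expanded into the bucket policy with a `dynamic` block. This is a minimal sketch, not the module's actual implementation: the variable name, resource names, and the one-key-per-test assumption are all illustrative.

```hcl
# Hypothetical sketch of how var.conditions could drive policy conditions.
variable "conditions" {
  type    = map(map(string))
  default = {}
}

data "aws_iam_policy_document" "bucket_policy" {
  statement {
    sid       = "AllowConditionalAccess"
    actions   = ["s3:GetObject"]
    resources = ["${aws_s3_bucket.default.arn}/*"] # assumed bucket resource name

    principals {
      type        = "AWS"
      identifiers = ["*"]
    }

    # One condition block per entry, e.g.
    # IpAddress => { "aws:SourceIp" = "192.0.2.0/24" }.
    # Assumes a single condition key per test, as in the example call.
    dynamic "condition" {
      for_each = var.conditions
      content {
        test     = condition.key
        variable = keys(condition.value)[0]
        values   = [condition.value[keys(condition.value)[0]]]
      }
    }
  }
}
```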
I've been working on the S3 module changes for replication, and have shelved this until that change is released, as it will be a breaking change.
Placed in blocked until @ep-93 returns from leave
I've done a little more reading on this, and the approach is still valid. Securing an object with this condition will still work, even though we've made some changes since this issue was written. The thing this solves is the ability for a user to move laterally across roles, like so:

- Role X in Account A that has access to Object B
- Role Y that has access to Object C

Because `aws:PrincipalTag` remains consistent as that of Account B, it will not match the value in the object tag, and access to Object C will be denied.

So, I can do the conditions, and dynamically; however, the exact request errors out with:
User: arn:aws:sts::----------assumed-role/AWSReservedSSO_AdministratorAccess_a7491ea25c15715b/ep-93@digital.justice.gov.uk is not authorized to perform: s3:PutBucketPolicy on resource: "arn:aws:s3:::s3-bucket20240903132505533800000001" because public policies are blocked by the BlockPublicPolicy block public access setting.
Messaging the person who raised the ticket for advice.
Case ID 172544793000638 raised with AWS
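One possible explanation (an assumption on my part, pending the AWS case): S3's Block Public Access treats any Allow statement with `Principal: "*"` as public unless it is restricted by one of a fixed set of condition keys, and `aws:PrincipalTag` is not on that list, so the `BlockPublicPolicy` setting rejects the `PutBucketPolicy` call. Restricting the principals to specific accounts is one way such a statement stops being classed as public:

```hcl
# Sketch of an assumed fix: avoid Principal "*" in Allow statements so
# the policy is not classed as public. The account ID is a placeholder.
principals {
  type        = "AWS"
  identifiers = ["arn:aws:iam::111111111111:root"]
}
```

Deny statements are not considered when S3 decides whether a policy is public, so a tag-matching guard written as a Deny should not trip this check.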
So I have talked to AWS and David S about this.
There is a PR here to start it, and it would work great; however, there is a step needed before this, and given the complexity I am not sure it's worth it.
https://github.com/ministryofjustice/modernisation-platform/pull/7860/files
So, firstly: what this was meant to protect against is already protected against; it cannot be done any more.
This was to add another layer. This layer, however, would require us to change the Terraform state bucket and tag each file (state file) with a tag called PermittedAccount. This would need to be populated with the account number relating to that account, e.g. the PPUD development state file would need the PPUD development account number as a tag.
Whilst this has been done with new files and new buckets, editing the current one to do this is a bit scary. I'm not saying it can't be done, but I'm questioning the reason for it now we have already circumvented this final security issue.
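For reference, the extra layer described above would look roughly like the following: a Deny that blocks reads of a state file when the caller's account does not match the object's PermittedAccount tag. This is a hedged sketch; the bucket name and statement wording are assumptions, not the actual implementation.

```hcl
# Sketch only: deny GetObject when the calling account does not match
# the PermittedAccount tag on the object. Bucket name is a placeholder.
data "aws_iam_policy_document" "state_tag_guard" {
  statement {
    sid       = "DenyMismatchedPermittedAccount"
    effect    = "Deny"
    actions   = ["s3:GetObject"]
    resources = ["arn:aws:s3:::my-tfstate-bucket/*"]

    principals {
      type        = "AWS"
      identifiers = ["*"]
    }

    condition {
      test     = "StringNotEquals"
      variable = "s3:ExistingObjectTag/PermittedAccount"
      values   = ["$${aws:PrincipalAccount}"] # IAM policy variable, escaped for HCL
    }
  }
}
```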
Notes
```hcl
join(",", nonsensitive(local.environment_accounts["ppud"]))
```
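For context, a hedged sketch of what `local.environment_accounts` might look like for that expression to work; the real definition is in the commit linked under Additional Information, and the account IDs here are placeholders.

```hcl
locals {
  # Assumed shape: environment alias => list of account IDs, marked
  # sensitive so account numbers don't appear in plan output by default.
  environment_accounts = sensitive({
    ppud = ["111111111111", "222222222222"]
  })
}

# join(",", nonsensitive(local.environment_accounts["ppud"])) then
# renders the PPUD account IDs as a single comma-separated string,
# suitable for use in a policy condition value.
```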
Raising an ADR for this.
User Story
As a Modernisation Platform customer, I expect to have secure access to sensitive objects in S3, so that I am protected against unexpected disclosures.
Value / Purpose
While we can control access to objects in S3, such as Terraform state files, through bucket policies, we risk accidental disclosure if bucket policies and IAM policies are not suitably strict.
We should consider the value of reviewing our bucket policies with the assistance of AWS support, and implementing security using a condition block similar to the one below:
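The referenced condition block did not survive in this copy of the issue, but given the rest of the thread it is presumably along these lines; a sketch with assumed names:

```hcl
# Sketch only: permit access when the caller's account matches the
# object's PermittedAccount tag. The tag key is taken from the thread;
# the exact wording is an assumption.
condition {
  test     = "StringEquals"
  variable = "s3:ExistingObjectTag/PermittedAccount"
  values   = ["$${aws:PrincipalAccount}"] # IAM policy variable, escaped for HCL
}
```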
Useful Contacts
No response
Additional Information
https://github.com/ministryofjustice/modernisation-platform-environments/commit/bf53e19e71977f24a2e622e5f50230fd44fe4530 << commit showing the implementation of tag-secured objects
https://github.com/ministryofjustice/modernisation-platform/commit/fb0cfaf351119590326eb1a163cdb20d6b66e27a << commit showing an example of how to create a local value that contains a map of maps with key/value pairs equal to account-alias/account-id
Proposal / Unknowns
Definition of Done