
AccessDenied: Switching to default workspace creates a temporary statefile and terraform checks its permissions which breaks other workspaces #24792

Open sufiyanghori opened 4 years ago

sufiyanghori commented 4 years ago

Terraform Version

Terraform v0.12.24

Details

When a Terraform project is initialized for the first time, Terraform also makes an s3:GetObject call on the parent path in the S3 backend bucket (i.e. the bucket root) if an object with the same name as the state file exists there. (This parent object is created automatically when you switch to the default workspace.)

For example,

Suppose AWS Account A has an S3 backend bucket called my-terraform-bucket. The bucket has a single workspace called development and a state file called my.state.file.

So the state file path becomes,

s3://my-terraform-bucket/env:/development/my.state.file

Now, I want to grant the role arn:aws:iam::1234567890:role/MyDevelopmentRole permission to access the state file. So the corresponding S3 backend configuration becomes:

bucket   = "my-terraform-bucket"
key         = "my.state.file"
region    = "ap-southeast-2"
role_arn = "arn:aws:iam::1234567890:role/MyDevelopmentRole"

The bucket has the following policy attached:

{
    "Version": "2012-10-17",
    "Id": "Policy1587342740129",
    "Statement": [
        {
            "Sid": "Stmt1587342264948",
            "Effect": "Allow",
            "Principal": {
                "AWS": [
                    "arn:aws:iam::1234567890:role/MyDevelopmentRole"
                ]
            },
            "Action": "s3:ListBucket",
            "Resource": "arn:aws:s3:::my-terraform-bucket"
        },
        {
            "Sid": "Stmt1587342562070",
            "Effect": "Allow",
            "Principal": {
                "AWS": [
                    "arn:aws:iam::1234567890:role/MyDevelopmentRole"
                ]
            },
            "Action": [
                "s3:GetObject",
                "s3:PutObject"
            ],
            "Resource": "arn:aws:s3:::arn:aws:s3:::my-terraform-bucket/env:/development/*"
        }

Now if I run terraform init, it will work. However, if I switch to the default workspace (i.e. terraform12 workspace select default), it will create a new state file at the root of the backend bucket, i.e.

s3://my-terraform-bucket/my.state.file

terraform init (if the local .terraform directory has been removed before running it) will then fail, because it tries to execute GetObject on s3://my-terraform-bucket/my.state.file instead of s3://my-terraform-bucket/env:/development/my.state.file, and throws the following error:

Error refreshing state: AccessDenied: Access Denied
2020/04/28 17:34:02 [DEBUG] plugin: waiting for all plugin processes to complete...
    status code: 403, request id: 234BF6FBBBE, host id: 58X3SuY2Qiq9Wu/OQ9zZ7jqAVgMiOe6UpjjDTU=

Terraform Configuration Files

bucket   = "my-terraform-bucket"
key         = "my.state.file"
region    = "ap-southeast-2"
role_arn = "arn:aws:iam::1234567890:role/MyDevelopmentRole"

Debug Output

---[ REQUEST POST-SIGN ]-----------------------------
GET /my.state.file HTTP/1.1

Crash Output

Error refreshing state: AccessDenied: Access Denied
2020/04/28 17:34:02 [DEBUG] plugin: waiting for all plugin processes to complete...
    status code: 403, request id: 234BF6FBBBE, host id: 58X3SuY2Qiq9Wu/OQ9zZ7jqAVgMiOe6UpjjDTU=

Expected Behavior

Terraform should have checked access on s3://my-terraform-bucket/env:/development/my.state.file and should have successfully initialized.

Actual Behavior

Terraform checks whether it can get s3://my-terraform-bucket/my.state.file and fails, because the bucket policy restricts access to a specific key only.
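As a workaround sketch only (it does not change the underlying behaviour): adding a statement like the following to the bucket policy's Statement array would let that check on the root key succeed. The Sid is made up; the ARNs are the ones used in this report.

{
    "Sid": "AllowRootStateKeyCheck",
    "Effect": "Allow",
    "Principal": {
        "AWS": "arn:aws:iam::1234567890:role/MyDevelopmentRole"
    },
    "Action": "s3:GetObject",
    "Resource": "arn:aws:s3:::my-terraform-bucket/my.state.file"
}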

Steps to Reproduce

  1. Create a new bucket for the S3 backend.

  2. Add a policy so that your role can access only the state file, and nothing else under the parent path:

    "Action": [
         "s3:GetObject",
          "s3:PutObject",
    ],
    "Resource": "arn:aws:s3:::arn:aws:s3:::your-bucket-name/env:/development/*"
  3. Create a new project with the following S3 backend config:

    bucket   = "your-bucket-name"
    key         = "my.state.file"
    region    = "ap-southeast-2"
    role_arn = "arn:aws:iam::1234567890:role/MyDevelopmentRole"
  4. Do terraform init and create a workspace called development.

  5. Delete the .terraform directory and re-run terraform init. This will work.

  6. Go to your S3 bucket and upload an empty file with the same name as the Terraform state file, so that the path of the file is s3://your-bucket-name/my.state.file.

  7. Delete the .terraform directory and re-run terraform init. This will not work (a condensed command sketch of steps 4-7 follows the list).
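Condensing steps 4-7 into commands (a sketch; your-bucket-name is the placeholder bucket from the backend config above, and the aws CLI is assumed to have write access to it):

terraform init
terraform workspace new development

# Step 5: re-initialising still works
rm -rf .terraform
terraform init

# Step 6: place an empty object at the root of the bucket, named like the state file
touch my.state.file
aws s3 cp my.state.file s3://your-bucket-name/my.state.file

# Step 7: re-initialising now fails with AccessDenied on the root key
rm -rf .terraform
terraform init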

Additional Context

References

danieldreier commented 4 years ago

We are no longer investigating issues reported against Terraform 0.11.x. Terraform 0.12 has been available since May of 2019, and there are really significant benefits to adopting it. We're actively working on Terraform 0.13. I know that adopting 0.12 can require a bit of effort, but it really is worth it, and the upgrade path is pretty well understood in the community by now.

This is otherwise a really well formed issue report, and I appreciate the careful writeup. Can you please try reproducing this issue on 0.12.24?

sufiyanghori commented 4 years ago

We are no longer investigating issues reported against Terraform 0.11.x. Terraform 0.12 has been available since May of 2019, and there are really significant benefits to adopting it. We're actively working on Terraform 0.13. I know that adopting 0.12 can require a bit of effort, but it really is worth it, and the upgrade path is pretty well understood in the community by now.

This is otherwise a really well formed issue report, and I appreciate the careful writeup. Can you please try reproducing this issue on 0.12.24?

Thank you Daniel for your comment; I just tried, and I can reproduce it in version 0.12.24 as well.

danieldreier commented 4 years ago

Thanks for reproducing that on 0.12.24 so quickly, @sufiyanghori! I think this is not a Terraform bug, but rather a configuration mismatch between how you've configured AWS and Terraform.

If I use

backend "s3" {
   bucket   = "my-terraform-bucket"
   key         = "my.state.file"
   region    = "ap-southeast-2"
   role_arn = "arn:aws:iam::1234567890:role/MyDevelopmentRole"
}

My expectation is that Terraform would try to store state in arn:aws:s3:::my-terraform-bucket/my.state.file. The IAM policy you're describing only grants "s3:PutObject" permissions inside the "arn:aws:s3:::my-terraform-bucket/env:/development/*" path. This is also at odds with the s3://my-terraform-bucket/env:/development/my.state.file path you described.

It looks to me like you have configured Terraform to put state in one place in S3, and configured an IAM policy that does not grant write access to that place.

Based on what you've described, I think you're looking for something more like:

backend "s3" {
   bucket   = "my-terraform-bucket"
   key         = "env/development/my.state.file"
   region    = "ap-southeast-2"
   role_arn = "arn:aws:iam::1234567890:role/MyDevelopmentRole"
}

Additionally, I suspect your IAM policy should omit the extra : in my-terraform-bucket/env:/development/* because that seems like a weird thing to have in an object name.

"Resource": "arn:aws:s3:::arn:aws:s3:::my-terraform-bucket/env:/development/*"

Based on the information you've provided, if I'm understanding this right, I don't immediately see evidence of a bug in Terraform core. It's possible that there's a defect, but there's not enough evidence here for me to treat it as one.

If the suggestions I've provided aren't enough to resolve your issue, I think your best bet is to seek support on the community forum. We use GitHub issues for tracking bugs and enhancements, rather than for questions. While we can sometimes help with certain simple problems here, it's better to use the community forum where there are more people ready to help. The GitHub issues here are monitored only by our few core maintainers.

Can you take another look at those paths and see whether this still looks like a bug to you?

sufiyanghori commented 4 years ago

Thank you @danieldreier for your response.

env: is a prefix added by default,

workspace_key_prefix - (Optional) The prefix applied to the state path inside the bucket. This is only relevant when using a non-default workspace. This defaults to "env:"

Source: https://www.terraform.io/docs/backends/types/s3.html
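For illustration only, the same prefix could also be set explicitly in the backend block; this sketch just spells out the documented default:

backend "s3" {
  bucket               = "my-terraform-bucket"
  key                  = "my.state.file"
  region               = "ap-southeast-2"
  role_arn             = "arn:aws:iam::1234567890:role/MyDevelopmentRole"
  workspace_key_prefix = "env:"   # documented default; only used for non-default workspaces
}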

With the following default configuration (no explicit workspace_key_prefix):

backend "s3" {
   bucket   = "my-terraform-bucket"
   key         = "my.state.file"
   region    = "ap-southeast-2"
   role_arn = "arn:aws:iam::1234567890:role/MyDevelopmentRole"
}

If you run terraform workspace new development, it will create a new workspace with a state file at s3://my-terraform-bucket/env:/development/my.state.file, not s3://my-terraform-bucket/my.state.file.

This process at no point creates a statefile at s3://my-terraform-bucket/my.state.file, but Terraform will check permissions on that key if an object with that name exists.

Also, Terraform recommends using policies such as the following:

Terraform will need the following AWS IAM permissions on the target backend bucket:

s3:ListBucket on arn:aws:s3:::mybucket
s3:GetObject on arn:aws:s3:::mybucket/path/to/my/key
s3:PutObject on arn:aws:s3:::mybucket/path/to/my/key

Source: https://www.terraform.io/docs/backends/types/s3.html

danieldreier commented 4 years ago

ah, thanks. My apologies, I'm a (relatively new) engineering manager on this project and haven't used workspaces together with the S3 backend, so I didn't understand that path. Thanks for clarifying.

I think I'm a bit out of my depth here, and this backend is maintained by the Terraform AWS Provider team, so I'm going to label it for their attention and let them pick up triage.

jmoberly commented 4 years ago

My team has hit issues with this as well: when a statefile is accidentally created at the root of the S3 bucket and that file references a newer TF version or a unique provider, it blocks all of our other projects from running terraform init. We provide the workspace_key_prefix on every project.
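For reference, setting that prefix per project would look roughly like this (the prefix value my-project is hypothetical):

backend "s3" {
  bucket               = "my-terraform-bucket"
  key                  = "my.state.file"
  region               = "ap-southeast-2"
  role_arn             = "arn:aws:iam::1234567890:role/MyDevelopmentRole"
  workspace_key_prefix = "my-project"   # hypothetical; non-default workspaces then live under my-project/<workspace>/my.state.file
}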

sufiyanghori commented 4 years ago

My team has hit issues with this as well: when a statefile is accidentally created at the root of the S3 bucket and that file references a newer TF version or a unique provider, it blocks all of our other projects from running terraform init. We provide the workspace_key_prefix on every project.

I have updated the ticket with more findings. I have realised that the statefile at the root of the bucket is not created accidentally; rather, it is created when you switch to the default workspace. This breaks permission checks for other workspaces.