hashicorp / terraform

Terraform enables you to safely and predictably create, change, and improve infrastructure. It is a source-available tool that codifies APIs into declarative configuration files that can be shared amongst team members, treated as code, edited, reviewed, and versioned.
https://www.terraform.io/

aws_s3_bucket_object does not allow accessing buckets in different regions #13228

Closed dynamike closed 7 years ago

dynamike commented 7 years ago

I'm trying to use the aws_s3_bucket_object data source to read from an S3 bucket in a different region than the one defined in the aws provider.

Terraform Version

Terraform v0.9.1

Affected Resource(s)

aws_s3_bucket_object (data source)

Terraform Configuration Files

provider "aws" {
  region = "us-west-2"
}
data "aws_s3_bucket_object" "app" {
  bucket = "${var.vpc_bucket}"
  key    = "env/APP_NAME"
}

Output

Error refreshing state: 1 error(s) occurred:

* data.aws_s3_bucket_object.aws-remote-state: Failed getting S3 object: BucketRegionError: incorrect region, the bucket is not in 'us-west-2' region
        status code: 301, request id:

Actual Behavior

I expected it to be able to read the S3 object. But the actual behavior is that I get the BucketRegionError, because var.vpc_bucket refers to a bucket in us-east-1 while the aws provider points to us-west-2.

dynamike commented 7 years ago

any update on this?

alshabib commented 7 years ago

Commented here as well.

You can actually do this by using Multiple Provider Instances. See example below:

Define a second provider:

provider "aws" {
  alias      = "my-other-provider"
  region     = "us-east-1"
}

Use this provider in your data source:

data "aws_s3_bucket_object" "app" {
  provider = "aws.my-other-provider"
  bucket = "${var.bucket_name}"
  key    = "env/APP_NAME"
}
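As a side note for readers on newer versions: in Terraform 0.12 and later the provider argument is an unquoted provider reference rather than a string, and variable interpolation no longer needs the "${...}" wrapping. A minimal sketch of the same data source in that syntax:

```hcl
data "aws_s3_bucket_object" "app" {
  # Terraform 0.12+ syntax: unquoted provider reference
  provider = aws.my-other-provider
  bucket   = var.bucket_name
  key      = "env/APP_NAME"
}
```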
Laxman-SM commented 6 years ago

@alshabib I tried your suggested solution, but it does not seem to work for me. I tried the multi-provider configuration and created a new bucket (since recycling a bucket name can take about an hour, as I read somewhere).

My Terraform version is 0.11.7.

provider.tf

provider "aws" {
  region = "${var.region}"
}

provider "aws" {
  alias  = "oregon"
  region = "us-west-2"
}

provider "aws" {
  alias  = "virginia"
  region = "us-east-1"
}

bucket.tf

resource "aws_s3_bucket" "segment_logs" {
  provider = "aws.virginia"
  bucket   = "${var.project}-${var.environment}-terra-logs"
  acl      = "private"
}

backend.tf

terraform {
  required_version = ">=0.11.0"

  backend "s3" {
    bucket  = "tfstates-dockerswarm"
    key     = "laxman1/terraform.tfstate"
    region  = "us-west-2"
    encrypt = true
    profile = "default"
    //dynamodb_table       = "tfstates-dockerswarm1-lock-lock"
    //kms_key_id           = "arn:aws:kms:us-east-1:117432468177:key/xxxxxxx"
    //workspace_key_prefix = "terraform-state"
  }
}

data "terraform_remote_state" "state" {
  backend = "s3"

  config = {
    provider = "aws.oregon"
    region   = "us-west-2"
    bucket   = "tfstates-dockerswarm"
    key      = "laxman1/terraform.tfstate"
    //skip_region_validation = "true"
    profile = "default"
    //dynamodb_table = "tfstates-dockerswarm1-lock-lock"
    //kms_key_id     = "arn:aws:kms:us-west-2:117432468177:key/xxxxxx"
    //workspace      = "${terraform.workspace}"
  }
}

Can anyone help here? I am getting the same error:

Error: Error refreshing state: 1 error(s) occurred:

alshabib commented 6 years ago

It looks like you defined your remote state bucket as being in us-west-2, but AWS says it is not there. You may want to recheck your configuration.
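For reference, a minimal sketch of the terraform_remote_state data source, assuming the state bucket actually lives in us-east-1 (the region value is an assumption here; use whatever region the bucket really resides in). Note that the config block takes backend settings only, such as bucket, key, and region, not a provider argument:

```hcl
data "terraform_remote_state" "state" {
  backend = "s3"

  config = {
    # region must match where the state bucket actually resides;
    # the s3 backend config does not accept a provider argument
    bucket = "tfstates-dockerswarm"
    key    = "laxman1/terraform.tfstate"
    region = "us-east-1"
  }
}
```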

ghost commented 4 years ago

I'm going to lock this issue because it has been closed for 30 days ⏳. This helps our maintainers find and focus on the active issues.

If you have found a problem that seems similar to this, please open a new issue and complete the issue template so we can capture all the details necessary to investigate further.