hashicorp / terraform-provider-aws

The AWS Provider enables Terraform to manage AWS resources.
https://registry.terraform.io/providers/hashicorp/aws
Mozilla Public License 2.0

[Enhancement]: DynamoDB does not recognize that S3 import does not allow local secondary indices #39875

Open pkgw opened 1 month ago

pkgw commented 1 month ago

Terraform Core Version

1.9.8

AWS Provider Version

5.72.1

Affected Resource(s)

aws_dynamodb_table

Expected Behavior

If you specify a DynamoDB table with both local_secondary_index and import_table blocks, the provider should report an error at plan time indicating that these two features cannot be combined. This constraint is barely documented, but it is enforced in the AWS web console, e.g.:

https://repost.aws/questions/QUHtCwkvOzRDaYdIq7TKbrLA/dynamodb-import-from-s3-does-not-support-localsecondaryindexes-bug-or-intended-behaviour

Actual Behavior

Terraform will attempt to create the resource, leading to the following hard-to-understand error:

Error: creating AWS DynamoDB Table ($NAME): operation error DynamoDB: ImportTable, https response error StatusCode: 400, RequestID: $ID, api error ValidationException: One or more parameter values were invalid: Number of attributes in KeySchema does not exactly match number of attributes defined in AttributeDefinitions

This is the error you get even when the primary keys and the LSI key(s) are all correctly listed as attribute blocks on the resource.
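In provider terms, the requested validation boils down to a conflict check between the two configuration blocks before any API call is made. A minimal sketch in Go of that check (names are hypothetical, not the provider's actual API; a real fix would likely hook into the resource's plan-time validation in terraform-plugin-sdk):

```go
package main

import (
	"errors"
	"fmt"
)

// validateImportTable sketches the plan-time check requested in this issue:
// reject configurations that combine import_table with local_secondary_index,
// since DynamoDB's S3 import does not support local secondary indexes.
// Function and parameter names are hypothetical.
func validateImportTable(hasImportTable bool, lsiCount int) error {
	if hasImportTable && lsiCount > 0 {
		return errors.New(`"import_table" conflicts with "local_secondary_index": DynamoDB S3 import does not support local secondary indexes`)
	}
	return nil
}

func main() {
	fmt.Println(validateImportTable(true, 1) != nil) // true: conflict detected
	fmt.Println(validateImportTable(true, 0))        // <nil>: import without LSIs is allowed
}
```

Failing this check during planning would surface a clear message instead of the opaque ValidationException above.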

Relevant Error/Panic Output Snippet

No response

Terraform Configuration Files

resource "aws_dynamodb_table" "mytable" {
  name         = "mytable"
  hash_key     = "hashKey"
  range_key    = "rangeKey"
  billing_mode = "PAY_PER_REQUEST"

  attribute {
    name = "hashKey"
    type = "S"
  }

  attribute {
    name = "rangeKey"
    type = "S"
  }

  attribute {
    name = "secondaryKey"
    type = "S"
  }

  local_secondary_index {
    name            = "secondaryIndex"
    range_key       = "secondaryKey"
    projection_type = "ALL"
  }

  import_table {
    input_compression_type = "GZIP"
    input_format           = "DYNAMODB_JSON"

    s3_bucket_source {
      bucket     = "mybucket"
      key_prefix = "mytable"
    }
  }
}
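Until the provider validates this combination, a user-side guard can fail the plan before the doomed ImportTable call ever happens. A sketch using a lifecycle precondition, assuming a hypothetical var.import_from_s3 flag that controls whether the import_table block is rendered:

```hcl
variable "import_from_s3" {
  type    = bool
  default = false
}

resource "aws_dynamodb_table" "mytable" {
  # ... same table definition as above, with local_secondary_index ...

  lifecycle {
    precondition {
      condition     = !var.import_from_s3
      error_message = "DynamoDB S3 import (import_table) cannot be combined with local_secondary_index."
    }
  }
}
```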

Steps to Reproduce

  1. Declare an aws_dynamodb_table resource that uses both local_secondary_index and import_table.
  2. Attempt to create it.

Debug Output

No response

Panic Output

No response

Important Factoids

No response

References

I can't find any DynamoDB documentation that explicitly specifies this constraint, but if you go to the web UI and run through the wizard to manually initiate an S3 import, you'll see the alert box shown in the screenshot in the re:Post link above.

Would you like to implement a fix?

No

github-actions[bot] commented 1 month ago

Community Note

Voting for Prioritization

Volunteering to Work on This Issue

justinretzolk commented 1 month ago

Hey @pkgw 👋 Thank you for taking the time to raise this! For something like this, where we're adding additional functionality to existing resources (in this case, additional validation), we'd consider this an enhancement rather than a bug. I'm going to update a couple of things about this report with that in mind. No further action is needed from you at this point; I just like to let people know before I make those kinds of modifications.