Open marcosmartinezfco opened 4 days ago
Do you have the script or Terraform handy that was used to create the DynamoDB table? I've not seen this error in use of this code, so I'm curious whether it's possible to configure DynamoDB in a way that triggers this error.
@rtyler It's a simple module that we use to create DynamoDB tables:
```hcl
module "market_updates_lock_table" {
  source     = "../../utils/dynamodb"
  table_name = "delta_log"
  hash_key   = "tablePath"

  attributes = [
    {
      name = "tablePath"
      type = "S"
    }
  ]
}
```
---
```hcl
# "../../utils/dynamodb"
resource "aws_dynamodb_table" "this" {
  name           = var.table_name
  hash_key       = var.hash_key
  range_key      = var.range_key
  billing_mode   = var.billing_mode
  read_capacity  = 1
  write_capacity = 1

  dynamic "attribute" {
    for_each = var.attributes
    content {
      name = attribute.value.name
      type = attribute.value.type
    }
  }

  dynamic "global_secondary_index" {
    for_each = var.secondary_index != null && var.secondary_index_hash_key != null ? [1] : []
    content {
      name               = var.secondary_index
      hash_key           = var.secondary_index_hash_key
      non_key_attributes = var.secondary_index_non_key_attributes
      projection_type    = "INCLUDE"
      write_capacity     = 1
      read_capacity      = 1
    }
  }
}
```
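One thing worth double-checking against the module call above: the delta-rs documentation describes the lock table with both a partition key (`tablePath`) and a sort key (`fileName`), while the module invocation only sets a hash key. A hedged sketch of the documented schema, reusing the same hypothetical module interface (this assumes the module forwards `range_key` through to `aws_dynamodb_table`):

```hcl
# Sketch only: key and attribute names follow the delta-rs docs for the
# S3 DynamoDB log store; everything else mirrors the module call above.
module "market_updates_lock_table" {
  source     = "../../utils/dynamodb"
  table_name = "delta_log"
  hash_key   = "tablePath"
  range_key  = "fileName" # sort key the delta-rs log store expects

  attributes = [
    {
      name = "tablePath"
      type = "S"
    },
    {
      name = "fileName"
      type = "S"
    }
  ]
}
```

If the table was created without the sort key, DynamoDB would reject the log store's writes even though the table itself exists, which could line up with the behavior described below.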
And this is the item in the DynamoDB table:
Environment
Delta-rs version: 0.18.2
Binding: Python
Environment:
Bug
What happened:
Encountered errors when trying to write to a Delta Lake table stored in S3 with DynamoDB used for transaction logs. The errors indicate failures to write to DynamoDB for transaction entries.
Error log:
What you expected to happen:
Expected the Delta Lake write operation to complete successfully, with transaction entries properly written to DynamoDB.
How to reproduce it:
Example script to reproduce the issue:
More details:
I have confirmed that I have admin permissions on my AWS account, so permissions should not be an issue. This problem persists even with the correct configurations in place. Funnily enough, the operation does append the item to the table, so the error apparently isn't affecting the write.
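For completeness, this is a minimal sketch of the writer-side configuration delta-rs expects when routing S3 commits through DynamoDB; the option names come from the delta-rs docs, and the table name here is assumed to match the Terraform table above:

```python
# Sketch: storage_options mapping for delta-rs DynamoDB-backed S3 locking.
# Option names per the delta-rs docs; the table name is an assumption that
# must match the DynamoDB table created in Terraform.
storage_options = {
    "AWS_S3_LOCKING_PROVIDER": "dynamodb",   # enable the DynamoDB log store
    "DELTA_DYNAMO_TABLE_NAME": "delta_log",  # defaults to "delta_log" if unset
}

# Usage (requires the deltalake package and AWS credentials):
# from deltalake import write_deltalake
# write_deltalake("s3://<bucket>/<table-path>", df,
#                 mode="append", storage_options=storage_options)
```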