grayaii closed this issue 7 years ago
I am not seeing any data in S3, and I don't see any errors anywhere.
This is how I build/deploy and create a dummy table:
```shell
cd src
./build.sh alex_02.hjson
python deploy.py --redeploy --config-file alex_02.hjson
aws dynamodb create-table --region us-west-2 \
  --attribute-definitions AttributeName=MyHashKey,AttributeType=S \
  --key-schema AttributeName=MyHashKey,KeyType=HASH \
  --provisioned-throughput ReadCapacityUnits=1,WriteCapacityUnits=1 \
  --table-name alex02_foobar6
```
The lambda function gets fired and the output looks correct:
I then manually created some items via the AWS Console:
But the S3 bucket is empty.
The stream for my table looks like there is some data that is going through:
What could be the issue? Or better yet, how can I debug this?
OK, I figured it out. It was a permissions issue between Firehose and the S3 bucket.
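For anyone hitting the same symptom (stream shows traffic, Lambda runs clean, but the bucket stays empty): Firehose fails silently to the caller when its delivery role can't write to the destination bucket. A minimal sketch of the S3 permissions the Firehose delivery role needs, per the standard Firehose-to-S3 setup (the bucket name `my-firehose-bucket` is a placeholder; substitute your own):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:AbortMultipartUpload",
        "s3:GetBucketLocation",
        "s3:GetObject",
        "s3:ListBucket",
        "s3:ListBucketMultipartUploads",
        "s3:PutObject"
      ],
      "Resource": [
        "arn:aws:s3:::my-firehose-bucket",
        "arn:aws:s3:::my-firehose-bucket/*"
      ]
    }
  ]
}
```

Attach this policy to the IAM role configured on the delivery stream (not the Lambda's role). Delivery failures also show up in the stream's CloudWatch metrics and, if S3 backup logging is enabled, in the Firehose error output prefix.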