shevchenkos / DynamoDbBackUp


We need this baby to go waaay faster ;) #34

Open sokoow opened 7 years ago

sokoow commented 7 years ago

This thing doesn't handle throttling, or anything over 1060 simultaneous records being pulled. Here's a session pulling 2000 records at a time:

[14:49:51] Using gulpfile ~/git/DynamoDbBackUp/gulpfile.js
[14:49:51] Starting 'backup-full'...
Got key schema [{"AttributeName":"userkey","KeyType":"HASH"}]
Retrieved 1060 records; total at 1060 records.
[14:49:57] Finished 'backup-full' after 5.77 s
Retrieved 1060 records; total at 2120 records.
Retrieved 1060 records; total at 3180 records.
Retrieved 1060 records; total at 4240 records.
Retrieved 1060 records; total at 5300 records.
Retrieved 1060 records; total at 6360 records.
(node:3936) UnhandledPromiseRejectionWarning: Unhandled promise rejection (rejection id: 1): ProvisionedThroughputExceededException: The level of configured provisioned throughput for the table was exceeded. Consider increasing your provisioning level with the UpdateTable API.
(node:3936) [DEP0018] DeprecationWarning: Unhandled promise rejections are deprecated. In the future, promise rejections that are not handled will terminate the Node.js process with a non-zero exit code.
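The warning in the log above means a promise rejected without a `.catch()` handler attached. A minimal, generic sketch of the failure mode (`scanPage` is a hypothetical stand-in, not the tool's actual code):

```javascript
// Stand-in for a DynamoDB scan call that gets throttled.
function scanPage() {
  const err = new Error(
    'The level of configured provisioned throughput for the table was exceeded.'
  );
  err.code = 'ProvisionedThroughputExceededException';
  return Promise.reject(err);
}

// Without this .catch(), Node emits UnhandledPromiseRejectionWarning,
// and newer Node versions terminate the process with a non-zero exit code.
scanPage().catch(err => console.error('backup page failed:', err.code));
```

So even before adding retries, every scan promise in the backup loop needs an error handler to keep a single throttled page from taking the whole process down.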
shevchenkos commented 7 years ago

Hi @sokoow. It looks like you exceeded the maximum provisioned throughput configured for the table. I see 3 scenarios:

sokoow commented 7 years ago

Well yeah, but this shouldn't just crash, right?

shevchenkos commented 7 years ago

@sokoow, yes, you're right. To me the best approach is to report the problem, keep the process running, and retry later.
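That notify-and-retry idea could be sketched as a small backoff wrapper (a hypothetical helper, not part of DynamoDbBackUp; the retry limits and delays are assumptions):

```javascript
// Retry an async operation with exponential backoff when DynamoDB
// reports throttling, instead of letting the rejection go unhandled.
async function withBackoff(operation, maxRetries = 5, baseDelayMs = 100) {
  for (let attempt = 0; ; attempt++) {
    try {
      return await operation();
    } catch (err) {
      const retryable = err.code === 'ProvisionedThroughputExceededException';
      if (!retryable || attempt >= maxRetries) throw err;
      const delay = baseDelayMs * 2 ** attempt; // 100ms, 200ms, 400ms, ...
      console.warn(`Throttled; retrying in ${delay} ms (attempt ${attempt + 1})`);
      await new Promise(resolve => setTimeout(resolve, delay));
    }
  }
}
```

Each scan/batch call in the backup loop would then be wrapped, e.g. `withBackoff(() => scanNextPage(params))`, so throttled pages are logged and retried while the backup keeps going.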