wdarsono closed this issue 10 years ago.
Do you mean it lists the wrong columns (not all of the columns? how many?)? How many rows does it list?
It's hard for me to replicate this without significant cost (transfer, storage, and the time to set up such a table). If you'd like to give me read-only access to your table, I can try to debug.
Hi Erik,
1) Apparently my table is too big to be read in a single scan (a Scan response is limited to 1 MB), so I needed to scan the table multiple times. Please refer to http://docs.aws.amazon.com/amazondynamodb/latest/developerguide/QueryAndScan.html, section "Paginating the Results".
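For context, a paginated Scan keeps feeding the LastEvaluatedKey from one response back in as ExclusiveStartKey on the next request until no key is returned. A minimal sketch, assuming the AWS SDK for JavaScript (v2); the table name, region, and function names are illustrative and not taken from dynamoDBtoCSV.js:

```js
var AWS = require("aws-sdk");
var dynamodb = new AWS.DynamoDB({ region: "us-east-1" });

function scanAll(startKey, items, done) {
  var params = { TableName: "table1" };
  if (startKey) {
    params.ExclusiveStartKey = startKey; // resume where the previous page stopped
  }
  dynamodb.scan(params, function (err, data) {
    if (err) {
      return done(err);
    }
    items = items.concat(data.Items);
    if (data.LastEvaluatedKey) {
      // More pages remain: each Scan response is capped at 1 MB of data.
      return scanAll(data.LastEvaluatedKey, items, done);
    }
    done(null, items);
  });
}

scanAll(null, [], function (err, items) {
  if (err) throw err;
  console.log("fetched " + items.length + " items");
});
```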
2) Depending on how the program concatenates strings to produce the .csv file (writing each row to the .csv file as it goes, or writing everything in one pass after the whole string is constructed), a "messy" cell value that contains a double quote or a comma, like "test1 ... , ...test2", can also cause problems (it did in my case).
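To illustrate the quoting problem: a field containing a comma, a double quote, or a newline has to be wrapped in quotes, with internal quotes doubled (per RFC 4180). A small hypothetical helper, not taken from dynamoDBtoCSV.js:

```js
// Quote a value for CSV output only when it contains a comma, a double quote,
// or a newline; internal double quotes are doubled.
function csvEscape(value) {
  var s = String(value);
  if (/[",\n]/.test(s)) {
    return '"' + s.replace(/"/g, '""') + '"';
  }
  return s;
}

console.log(csvEscape('test1 ... , ...test2')); // "test1 ... , ...test2"
console.log(csvEscape('He said "hi"'));         // "He said ""hi"""
```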
You can close this case.
Many thanks and best regards, Willie
Hi Erik,
I am new to DynamoDB and Node.js.
I tried to convert a DynamoDB table of 1426 rows * 204 columns into a CSV file. When I ran node dynamoDBtoCSV.js -t table1 -d, it returned the correct ItemCount: 1426. It also returned TableSizeBytes: 1335111 (which I am not sure how to verify for a DynamoDB table).
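(For reference, the same ItemCount and TableSizeBytes can be read directly with a describeTable call; a minimal sketch, assuming the AWS SDK for JavaScript (v2), with "table1" and the region as placeholders. DynamoDB only refreshes these statistics periodically, so the values are approximate.)

```js
var AWS = require("aws-sdk");
var dynamodb = new AWS.DynamoDB({ region: "us-east-1" });

dynamodb.describeTable({ TableName: "table1" }, function (err, data) {
  if (err) throw err;
  // These figures are maintained by DynamoDB and updated periodically.
  console.log("ItemCount:      " + data.Table.ItemCount);
  console.log("TableSizeBytes: " + data.Table.TableSizeBytes);
});
```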
However, when I ran node dynamoDBtoCSV.js -t table1 > output.csv, the output.csv file that was created did not contain the correct table structure. I tried changing the "Limit" value from 1000 to 1000000 in dynamoDBtoCSV.js, as follows: var query = { "TableName": program.table, "Limit": 1000000 }; but the produced output.csv was still wrong (as before).
Is there any limit on the number of rows or columns, or on the table size, that can be converted? Can you please advise how to handle this issue?
Many thanks, Willie