keen99 opened this issue 7 years ago (status: Open)
@keen99 did you make any progress on this? I haven't used this feature before.
/cc @rclark
not really - all I managed to confirm through hackery was that the first line is all that ever gets read in `import`, and in `put` the lines get read but `Importer` is never called. (roughly. it's been a long long long few days since I was in there.)
Yeah there's definitely at least a few things wrong:

- It's possible that the pause/resume logic flow here doesn't work like it was intended to. This could cause the data records in an `import` operation to "vanish".
- DynamoDB Stream information needs to be removed from `export` output prior to an `import` command. This ought to be done in the `cleanDescription` function.
- The `Parser()` is doing some kinda screwball something to the first line in the data it receives. Firing the callback prior to the `return` here makes the dyno `put` work the way it should, but still didn't get information across on export/import.
thanks! firing the callback before `Parser`'s firstline `return` got `put` working for me (as long as the first line isn't schema):
```js
if (firstline) {
  firstline = false;
  var parsed = Dyno.deserialize(record);
  setImmediate(callback);
  if (!Object.keys(parsed).every(function(key) {
    return !!parsed[key];
  })) return this.push(JSON.parse(record.toString()));
}
```
so with this I can at least create from a schema definition and load data - since I already have a custom "export" wrapper tool that does just schema dumps (https://github.com/mapbox/dynamodb-replicator/issues/85), I'll probably just add modified versions of the import/put tools to that to get past my restore hurdles for dynamodb-replicator for now.

thanks!
well, looks like `put` in the above case only works for 2550 lines of input - on 2551 (with some light variation testing, it seems to always be the 2551st line):
```
cheese:~/git/github/dynamodb-replicator(dsr-hackery)%% cat $TABLE.out.json | tail -n2551 | node bin/dyno.js put $AWS_REGION/dsr-test-restore-$TABLE
Error: no writecb in Transform class
    at afterTransform (_stream_transform.js:71:33)
    at TransformState.afterTransform (_stream_transform.js:54:12)
    at Transform.Importer.importer._transform (/Users/draistrick/git/github/dynamodb-replicator/bin/dyno.js:211:7)
    at runCallback (timers.js:576:20)
    at tryOnImmediate (timers.js:550:5)
    at processImmediate [as _immediateCallback] (timers.js:529:5)
cheese:~/git/github/dynamodb-replicator(dsr-hackery)%% cat $TABLE.out.json | tail -n2550 | node bin/dyno.js put $AWS_REGION/dsr-test-restore-$TABLE
cheese:~/git/github/dynamodb-replicator(dsr-hackery)%%
```
ok, I've been banging my head on this too long now.

Simple summary - `import` (and `put`) don't actually load data into ddb. I've also tested this with `put` (removing the first line that contains the table schema from the export) after creating a table, with the same result: empty stdout/stderr, and no data added to ddb.

running v1.4.1 code:
The tests added in #117 for #96 seem to pass:
and using the direct code (vs the npm installed package) has the same result: