Closed: flyingzumwalt closed this 8 years ago
I've cleaned this up a bit and got it working so that the tests can write stuff into the hypercore feed and then make assertions about the feed's contents.
Anyone want to work on this? @mels065 @CameronSima @benjaminettori ?
This still isn't parsing the CSV correctly. I tried using parse-input-stream, but it seems like that isn't actually parsing the contents.
To do
I can take a look.
I recommend merging this PR to master before proceeding with other tickets, so people can pull down the merged master and use this code as a starting point.
PRs should be reviewed by 2 people before merging. If you're the first reviewer and everything looks good, add a :+1: in the comments. Then the second reviewer can merge it if it looks good to them as well.
@mels065 @CameronSima @benjaminettori @clamorisse @akireserrano86
:+1:
The results of this evening's work. It's almost working!
The tests are not passing, but it's almost there. We got stuck on figuring out how to get everything to fire in the right order.
This code is on the `import_stream` branch of the main jawn git repository, so you can get a copy of the code by checking out that branch. That should give you a local branch called `import_stream` with the latest code from that branch on GitHub.
Options to Consider
We aren't quite using streams properly. For ideas, consider the dat 2015 beta code.
`lib/import.js` — This file exports a function which returns a duplex stream. It uses pumpify to create a duplex stream that pipes the input through two transforms and into the `writeStream` (in our case, the writeStream would be the hypercore writeStream).

`bin/import.js` — This binary executable (triggered when you call `datbeta import` on the command line) uses the duplex stream from `lib/import.js`. It sets up the importer duplex stream and then uses pump to pipe an input stream into the `importer` on line 68.