gulpjs / glob-stream

Readable streamx interface over anymatch.
MIT License

How to properly read chunks of all files content into a single stream? #95

Closed by Ajaxy 6 years ago

Ajaxy commented 6 years ago

I'd love to not only get chunks with filenames, but also read all files' contents in chunks (i.e. with fs.createReadStream) and send those chunks into a single common output stream. Wondering what the right and shortest way to do this would be? Thanks.

Now I'm doing it like this:

const outputStream = through2.obj();
const filenamesStream = gs(`${this.dirName}/*`);

filenamesStream.on('end', () => { outputStream.end(); });
filenamesStream.pipe(through2.obj(({ path }, _, cb) => {
    const fileStream = fs.createReadStream(path, { encoding: 'utf-8' });
    fileStream.on('end', () => { cb(null); });
    fileStream.pipe(outputStream, { end: false });
}));

return outputStream;

but it's quite ugly.

phated commented 6 years ago

The way this was solved from the gulp perspective was attaching the fileStream as a contents property in an object and passing that through the original stream (in this case, filenameStream). I wouldn't recommend that approach since it has caused us all sorts of trouble; however, you can check out the implementation in vinyl-fs if you want.

Ajaxy commented 6 years ago

Thanks

yocontra commented 6 years ago

@Ajaxy This might be what you're looking for: https://github.com/teambition/merge2

Pseudo-code:

const outputStream = merge2();
const filenamesStream = gs(`${this.dirName}/*`);

filenamesStream.on('data', ({ path }) =>
  outputStream.add(fs.createReadStream(path, { encoding: 'utf-8' }))
)

return outputStream;

Assuming you want to take every file in a folder and put the contents in a stream in sequence.