Closed · piranna closed this issue 9 years ago
Another option would be to write each file to disk as soon as it gets decompressed.
Use the --max_old_space_size flag of the node command.
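For reference, the flag raises V8's old-generation heap limit (value in megabytes), which works around the crash without fixing the underlying memory usage. A minimal sketch — `extract.js` is a placeholder name for whatever script does the download and decompression:

```shell
# Allow the Node process up to ~4 GB of old-generation heap
# before running the extraction script (script name is hypothetical):
node --max_old_space_size=4096 extract.js
```

Note that this only postpones the failure for larger archives; streaming the decompression is the real fix.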
At least this should be discussed in https://github.com/cscott/seek-bzip/issues.
When trying to download http://gd.tuwien.ac.at/gnu/gcc/snapshots/5-20150616/gcc-5-20150616.tar.bz2 I got the following exception:
It seems to me that it's trying to decompress the whole archive in memory at once instead of streaming it chunk by chunk.