kevva / download

Download and extract files
MIT License

"Buffer larger than maximum size" exception #66

Closed: piranna closed this issue 9 years ago

piranna commented 9 years ago

When trying to download http://gd.tuwien.ac.at/gnu/gcc/snapshots/5-20150616/gcc-5-20150616.tar.bz2 I got the following exception:

buffer.js:71
    throw new RangeError('Attempt to allocate Buffer larger than maximum ' +
          ^
RangeError: Attempt to allocate Buffer larger than maximum size: 0x3fffffff bytes
    at new Buffer (buffer.js:71:11)
    at outputStream.writeByte (/home/piranna/Proyectos/NodeOS/node_modules/download/node_modules/gulp-decompress/node_modules/decompress/node_modules/decompress-tarbz2/node_modules/seek-bzip/seek-bzip/index.js:474:23)
    at Bunzip._read_bunzip (/home/piranna/Proyectos/NodeOS/node_modules/download/node_modules/gulp-decompress/node_modules/decompress/node_modules/decompress-tarbz2/node_modules/seek-bzip/seek-bzip/index.js:430:25)
    at Function.Bunzip.decode (/home/piranna/Proyectos/NodeOS/node_modules/download/node_modules/gulp-decompress/node_modules/decompress/node_modules/decompress-tarbz2/node_modules/seek-bzip/seek-bzip/index.js:508:10)
    at DestroyableTransform._transform (/home/piranna/Proyectos/NodeOS/node_modules/download/node_modules/gulp-decompress/node_modules/decompress/node_modules/decompress-tarbz2/index.js:59:19)
    at DestroyableTransform.Transform._read (/home/piranna/Proyectos/NodeOS/node_modules/download/node_modules/through2/node_modules/readable-stream/lib/_stream_transform.js:184:10)
    at DestroyableTransform.Transform._write (/home/piranna/Proyectos/NodeOS/node_modules/download/node_modules/through2/node_modules/readable-stream/lib/_stream_transform.js:172:12)
    at doWrite (/home/piranna/Proyectos/NodeOS/node_modules/download/node_modules/through2/node_modules/readable-stream/lib/_stream_writable.js:237:10)
    at writeOrBuffer (/home/piranna/Proyectos/NodeOS/node_modules/download/node_modules/through2/node_modules/readable-stream/lib/_stream_writable.js:227:5)
    at DestroyableTransform.Writable.write (/home/piranna/Proyectos/NodeOS/node_modules/download/node_modules/through2/node_modules/readable-stream/lib/_stream_writable.js:194:11)
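For context, the cap the trace hits is the Buffer size limit of the Node.js versions of that era (0x3fffffff bytes, about 1 GiB, per the message above). A quick way to see the limit itself, independent of this package:

    // The RangeError comes from Buffer's own size cap, not from download:
    new Buffer(0x3fffffff); // succeeds (given enough free memory)
    new Buffer(0x40000000); // RangeError: Attempt to allocate Buffer larger than maximum size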

It seems to me that it's trying to decompress the whole file in memory at once instead of streaming it chunk by chunk.
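The trace supports that reading: decompress-tarbz2 calls Bunzip.decode from its _transform, which decodes the entire archive into a single output Buffer. A minimal sketch of that pattern, assuming seek-bzip's Bunzip.decode(buffer) API (the file name is taken from the report and purely illustrative):

    var fs = require('fs');
    var Bunzip = require('seek-bzip');

    // Decoding in one shot means the whole decompressed tarball must fit in
    // one Buffer, which is what overflows the ~1 GiB cap in the trace above.
    var compressed = fs.readFileSync('gcc-5-20150616.tar.bz2');
    var decompressed = Bunzip.decode(compressed);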

piranna commented 9 years ago

Another option would be to write each file to disk as it gets decompressed.
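A fully streaming pipeline along those lines never holds the decompressed tarball in memory. This sketch swaps in unbzip2-stream and tar-fs, which are not what download uses internally; it is only an illustration, and the paths are placeholders:

    var fs = require('fs');
    var bz2 = require('unbzip2-stream');
    var tar = require('tar-fs');

    // Decompress chunk by chunk and write each extracted entry straight to
    // disk, so memory use stays bounded regardless of archive size.
    fs.createReadStream('gcc-5-20150616.tar.bz2')
      .pipe(bz2())
      .pipe(tar.extract('gcc-5-20150616'))
      .on('finish', function () {
        console.log('extracted');
      });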

shinnn commented 9 years ago

Use the --max_old_space_size flag of the node command.
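The flag sets V8's old-generation heap limit in megabytes, for example (script.js is a placeholder):

    node --max_old_space_size=4096 script.js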

At the very least, this should be discussed in https://github.com/cscott/seek-bzip/issues.