twolfson / gif-encoder

Streaming GIF encoder
The Unlicense

about GIF memory limit exceeded #10

Closed SSShooter closed 7 years ago

SSShooter commented 7 years ago

I'm a Node.js novice and I have a question about the "GIF memory limit exceeded" error:

events.js:160
      throw er; // Unhandled 'error' event
      ^

Error: GIF memory limit exceeded. Please `read` from GIF before writing additional frames/information.
    at GIFEncoder.ByteCapacitor.flushData (C:\Users\home\Desktop\pixivDLerNode\node_modules\gif-encoder\lib\GIFEncoder.js:43:15)
    at GIFEncoder.flushData (C:\Users\home\Desktop\pixivDLerNode\node_modules\gif-encoder\lib\GIFEncoder.js:110:10)
    at emitNone (events.js:86:13)
    at GIFEncoder.emit (events.js:185:7)
    at GIFEncoder.addFrame (C:\Users\home\Desktop\pixivDLerNode\node_modules\gif-encoder\lib\GIFEncoder.js:216:8)
    at zipEntries.forEach (C:\Users\home\Desktop\pixivDLerNode\index.js:65:9)
    at Array.forEach (native)
    at makeGif (C:\Users\home\Desktop\pixivDLerNode\index.js:62:14)
    at WriteStream.<anonymous> (C:\Users\home\Desktop\pixivDLerNode\index.js:23:5)
    at emitNone (events.js:91:20)
    at WriteStream.emit (events.js:185:7)
    at fs.js:2037:14
    at FSReqWrap.oncomplete (fs.js:123:15)
SSShooter commented 7 years ago

Code:

function makeGif(id) {
  var fs = require('fs');
  var AdmZip = require('adm-zip');
  var GifEncoder = require('gif-encoder');
  var getPixels = require("get-pixels");
  var jpeg = require('jpeg-js');
  var file = fs.createWriteStream(id + '.gif');
  var zip = new AdmZip(id + '.zip');
  var zipEntries = zip.getEntries();
  // Decode the first frame only to get the GIF dimensions
  var wh = jpeg.decode(zip.readFile(zipEntries[0]));
  var gif = new GifEncoder(wh.width, wh.height);
  gif.read(100000000);
  gif.pipe(file);
  gif.writeHeader();
  zipEntries.forEach((zipEntry) => {
    var data = zip.readFile(zipEntry);
    var pixels = jpeg.decode(data).data;
    gif.addFrame(pixels);
  });
  gif.finish();
}
twolfson commented 7 years ago

The issue is that you're buffering up new frames (i.e. calling gif.addFrame()) without outputting them anywhere. The message is telling you to start outputting the stream somewhere (i.e. call gif.read(), gif.pipe(), or gif.on('data')) so it can free up buffer space.
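For concreteness, here is a minimal sketch of those three options (width, height, and frames below are placeholders, not taken from your code):

var fs = require('fs');
var GifEncoder = require('gif-encoder');

var gif = new GifEncoder(width, height);

// Option 1: pipe the encoder into any writable stream (what your code does).
gif.pipe(fs.createWriteStream('out.gif'));

// Option 2: handle each chunk yourself via 'data' events.
// gif.on('data', function (chunk) { /* write chunk somewhere */ });

// Option 3: pull chunks manually between frames.
// gif.read(1024 * 1024);

gif.writeHeader();
frames.forEach(function (pixels) {
  gif.addFrame(pixels); // pixels: RGBA data for one frame
});
gif.finish();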

We set the default maximum buffer to 64kb:

https://github.com/twolfson/gif-encoder/blob/0.6.0/lib/GIFEncoder.js#L73-L79

It can be adjusted via options.highWaterMark as documented here:

https://github.com/twolfson/gif-encoder/tree/0.6.0#new-gifencoderwidth-height-options

twolfson commented 7 years ago

I was going to write an example about calling .read() first, but your code seems to be properly ordered. I'm guessing you're running into a buffer size issue (i.e. over 64kb per frame), so you'll need to pass in your own limit:

var gif = new GifEncoder(wh.width, wh.height, {
  highWaterMark: 5 * 1024 * 1024 // 5MB
});
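As a rough sense of scale when picking a value: a frame is width × height pixels, so even at one byte per pixel a single 300×300 frame is roughly 88kb before any compression, already past the 64kb default.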
SSShooter commented 7 years ago

Thank you for your answer.

What's the relation between 'highWaterMark' and 'read'? And what does read() actually do?

SSShooter commented 7 years ago

...it works, but I am confused:

  var GifEncoder = require('gif-encoder');
  var getPixels = require("get-pixels");
  var jpeg = require('jpeg-js');
  var file = fs.createWriteStream(id + '.gif');
  var zip = new AdmZip(id + '.zip');
  var zipEntries = zip.getEntries();
  var wh = jpeg.decode(zip.readFile(zipEntries[0]));
  var gif = new GifEncoder(wh.width, wh.height, {
    highWaterMark: 50 * 1024 * 1024 
  });
  gif.pipe(file);
  gif.writeHeader();
  zipEntries.forEach((zipEntry) => {
    gif.read(1024 * 1024);
    console.log(zipEntry.name);
    var data = zip.readFile(zipEntry);
    var pixels = jpeg.decode(data).data;
    gif.addFrame(pixels);
  });
  console.log('addFrame finish');
  gif.read(1024 * 1024);
  gif.finish();
twolfson commented 7 years ago

highWaterMark sets the upper bound of our internal buffer for our GIFEncoder stream. .read() is one way to output content from the internal buffer, thus freeing up space for more frames.

This is based on Node.js' Transform stream, so more information can be found in its documentation:

https://nodejs.org/api/stream.html
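As an illustration of that relationship (a toy Transform stream, not gif-encoder itself): the internal buffer is capped by highWaterMark, and attaching any consumer ('data' here, but pipe() or read() work too) drains it so further writes can be accepted.

var Transform = require('stream').Transform;

// Toy Transform stream with a small internal buffer; gif-encoder's buffer
// behaves the same way, just holding GIF bytes instead of uppercased text.
var upper = new Transform({
  highWaterMark: 16, // cap on the internal buffer, in bytes
  transform: function (chunk, encoding, callback) {
    callback(null, chunk.toString().toUpperCase());
  }
});

// Attaching a consumer drains the internal buffer.
upper.on('data', function (chunk) {
  process.stdout.write(chunk);
});

upper.write('hello ');
upper.write('world\n');
upper.end();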

SSShooter commented 7 years ago

Thanks again!

nurpax commented 6 years ago

IMO this was prematurely closed. It’s easy to hit the buffer size limit and it’s not clear from the docs how to deal with this.

Since the original reporter is piping the gif-encoder output into a file stream, why are the contents of the internal buffer not flushed to the file when the buffer is full? I'd expect this to happen automatically. The docs say I should read() to flush the buffer, but I don't see why a read makes sense given that I'm only outputting to a writable stream.