bevacqua / shots

:gun: pull down the entire Internet into a single animated gif.
https://ponyfoo.com
MIT License

Getting "Error: stderr maxBuffer exceeded" when running the CLI #13

Open nujabse opened 8 years ago

nujabse commented 8 years ago

I tried "shots amazon.com -o ~/Documents/shots", waited for a while, and got the following error when the program finished:

{ [Error: stderr maxBuffer exceeded] cmd: 'waybackpack amazon.com -d /home/mathholic/Documents/shots/pages --start 1800' }

Is there any fix for this?

davisshaver commented 8 years ago

I ran into this too. I thought limiting the concurrency would help, but no dice.

davisshaver commented 8 years ago

I've worked around this by just re-running the command.

rileyjshaw commented 8 years ago

With -v:

./cli khanacademy.org -o ~/Desktop/ -v
  shots dest directory set to: /Users/rileyshaw/Desktop/ +0ms
  shots pmkdirp(pages) +3ms
  shots pmkdirp(screenshots) +1ms
  shots pmkdirp(diffs) +0ms
  shots pmkdirp(output) +0ms
  shots rglob('/Users/rileyshaw/Desktop/pages/*.html') +4ms
  shots resolving download stage +0ms
  shots waybackpack khanacademy.org -d /Users/rileyshaw/Desktop/pages --start 1800 +2ms
  shots ERR! Error: stderr maxBuffer exceeded +31m
{ Error: stderr maxBuffer exceeded
    at Socket.<anonymous> (child_process.js:274:14)
    at emitOne (events.js:96:13)
    at Socket.emit (events.js:188:7)
    at readableAddChunk (_stream_readable.js:172:18)
    at Socket.Readable.push (_stream_readable.js:130:10)
    at Pipe.onread (net.js:542:20)
  cmd: 'waybackpack khanacademy.org -d /Users/rileyshaw/Desktop/pages --start 1800' }

Ran from 12:11pm to 12:42pm, 1069 documents downloaded to pages before the error.

StevenACoffman commented 8 years ago

I run it endlessly like this:

while :; do DEBUG=shots node amazon.js ; done

Where amazon.js is:

'use strict';
var shots = require('shots').default;

shots({
  dest: '/Users/scoffman/shots/sobig/resources/shots',
  site: 'www.amazon.com',
  pageres: {
    crop: true,
    scale: 1,
    sizes: ['1024x768']
  },
  tolerance: 80,
  stages: {
    download: true,
    screenshots: true,
    diffing: true,
    filmstrip: true,
    gif: true
  }
});

This works around the problem, but there are many, many failures, so it still takes 1-2 days on my machine.

ghost commented 8 years ago

I have this issue too. I managed to get sites with very short histories to work, but any site actually worth making a GIF of runs into this issue.

ghost commented 8 years ago

@davisshaver does this continue from where it left off?