fluent-ffmpeg / node-fluent-ffmpeg

A fluent API to FFMPEG (http://www.ffmpeg.org)
MIT License

Generating screenshot from stream #352

Open gabrielstuff opened 9 years ago

gabrielstuff commented 9 years ago

Hi !

I'm trying to generate a series of screenshots from a readable stream using this:

var ffmpeg = require('fluent-ffmpeg'),
    fs = require('fs');

var command = 
ffmpeg(fs.createReadStream('./good_blink.mp4'))
  .on('end', function(files) {
    console.log('screenshots were saved as ' + files);
  })
  .on('error', function(err) {
    console.log('an error happened: ' + err.message);
  })
  .screenshots({
    timestamps: [0],
    filename: 'thumbnail-at-%s-seconds.png',
    folder: './thumbs',
    size: '320x240'
  });

Unfortunately the output is empty. If I run the same example with direct file input it works.

var ffmpeg = require('fluent-ffmpeg'),
    fs = require('fs');

var command = 
ffmpeg('./good_blink.mp4')
  .on('end', function(files) {
    console.log('screenshots were saved as ' + files);
  })
  .on('error', function(err) {
    console.log('an error happened: ' + err.message);
  })
  .screenshots({
    timestamps: [0],
    filename: 'thumbnail-at-%s-seconds.png',
    folder: './thumbs',
    size: '320x240'
  });

I do respect the non-percent usage; nonetheless, I don't get why it breaks...

Thanks !

njoyard commented 9 years ago

Yeah, the default screenshots implementation will not work with streams, because it has to fetch metadata first (to be able to resize) and thus "consumes" data from the stream.

You may be able to adapt it to your needs... And I should document this problem. When I have some spare time, I'll try to do a stream-friendly implementation.

gabrielstuff commented 9 years ago

Hey,

Just using https://www.npmjs.com/package/faststart fixed my issue. In the meantime I ended up rewriting the full command :)

Thanks for your help! And yes, please add it to the docs. The fact that mp4 metadata ordering matters for screenshots/thumbnails is important information.

var child_process = require('child_process'),
    faststart = require('faststart'),
    fs = require('fs');

var readStream = faststart.createReadStream(videoName),
    writeStream = fs.createWriteStream('./woop.jpg');

writeStream.on('error', function(err) {
  console.log('stream err: ', err);
});

var ffmpeg = child_process.spawn("ffmpeg", [
  '-i', 'pipe:0',                   // read input from stdin
  '-vf', 'thumbnail,scale=640:360', // pick a representative frame, resize it
  '-r', '1',
  '-frames:v', '1',                 // output a single frame
  '-f', 'mjpeg',
  'pipe:1'                          // write output to stdout
]);
ffmpeg.on('error', function(err) {
  console.log(err);
});
ffmpeg.stderr.on('data', function(data) {
  console.log('stderr: ' + data);
});
ffmpeg.on('close', function(code) {
  console.log('child process exited with code ' + code);
});

readStream.pipe(ffmpeg.stdin)
  .on('error', function(err) {
    console.log('error pipe in:', err);
  });

ffmpeg.stdout.pipe(writeStream)
  .on('error', function(err) {
    console.log('error while writing: ', err);
  });

njoyard commented 9 years ago

BTW, it is not a good idea at all to use two pipes between two processes on unix systems. You will end up with a deadlock at some point.

gabrielstuff commented 9 years ago

What do you suggest? How could I go from stdin to stdout without two pipes?

njoyard commented 9 years ago

You can use two pipes, but not between the same two processes. The way pipes work on unix systems does not prevent a situation where both processes wait for each other to send data. That's a known limitation (and it sure is a shame, because it would be very handy).

You may want to replace one of your pipes with a temp file, or you can keep using two pipes but between three processes:

input video fetching process ==> ffmpeg ==> output video handling process

In the end it all depends on what you're getting the input from and what you're doing with the output. In your example above, you're writing to a jpeg file, so just don't use a stream for that.

gabrielstuff commented 9 years ago

OK, got it. Unfortunately, this is just a test case... I need to feed some framework with a readable stream which will be a jpg.

Thanks !

njoyard commented 9 years ago

If your output is "only" a jpeg, it's quite small, so you won't add much overhead by writing that jpeg to a file and then feeding a fs.createReadStream() to the next stage!

ebuildy commented 9 years ago

I am using express to build an "ffmpeg as a service"; here is my "GET /screenshot" code, which works very well with RTMP:

router.get('/screenshot', function(req, res)
{
    var streamID    = req.query.stream,
        stream      = streams[streamID],
        file        = '/var/snapshot-' + streamID + '.png';

    var proc = ffmpeg('rtmp://' + stream.origin + ':1935/' + stream.app + '/' + stream.stream)
        .on('end', function(files)
        {
            res.sendfile(file);
        })
        .on('error', function(err)
        {
            res.json({
                status : 'error',
                error : err.message
            });
        })
        .outputOptions(['-f image2', '-vframes 1', '-vcodec png', '-f rawvideo', '-s 320x240', '-ss 00:00:01'])
        .output(file)
        .run();
});
chmanie commented 8 years ago

@ebuildy Thanks for sharing! Your solution works quite fine for one screenshot. Is it possible to create 2 or more screenshots using a similar technique?

dminkovsky commented 4 years ago

two pipes between two processes on unix systems

Could someone confirm my understanding of the two pipes and the two processes issue here?

The two processes:

1) Node
2) Spawned ffmpeg

The two pipes:

1) Node read stream, piped out to the input of ffmpeg
2) ffmpeg output, piped back to a stream that is handled by Node?

Deadlock may occur because the buffers that back these pipes may fill up, so Node and ffmpeg may each end up waiting on the other? A solution would be to have ffmpeg write the result to a file instead of piping it back to Node, and then have Node read that file after it is created?

Bracketstech commented 2 days ago

I am facing the same problem. Any updates on taking screenshots from a stream?