Open felicemarra opened 4 years ago
In the scratch folder you will find stream_mp4.js. You should be able to take that and adjust it to do what you want. You may need to add filters to set or adjust the timestamps if the two sources either don't have them or they don't line up. Good luck!
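For the timestamp part, a filterer running setpts is one way to do it. A rough sketch, assuming 25 fps yuv420p input (all of the parameters below are placeholders to adapt):

```js
const beamcoder = require('beamcoder');

// Sketch of a video filterer that regenerates PTS values at a constant rate.
// Width, height, pixel format and frame rate are placeholders.
async function makePtsFilter() {
  return beamcoder.filterer({
    filterType: 'video',
    inputParams: [{
      width: 1280,
      height: 720,
      pixelFormat: 'yuv420p',
      timeBase: [1, 25],
      pixelAspect: [1, 1]
    }],
    outputParams: [{ pixelFormat: 'yuv420p' }],
    // setpts=N/(25*TB) stamps frames as if they arrived at exactly 25 fps
    filterSpec: 'setpts=N/(25*TB)'
  });
}

// const filt = await makePtsFilter();
// const result = await filt.filter([frame]); // result[0].frames holds the re-stamped frames
```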
Mmm... but I have a live feed from a WebRTC source. The stream_mp4.js example works with a MOV file as its source.
I need to push frames as in examples/make_mp4.js, but with audio, not only video.
Ah. WebRTC and FFmpeg don't appear to mix at the moment. You should be able to find what you need by looking in the beamstreams.js file at the top level. There's a fair amount of Node streams work going on in there, but the basics of setting up the encoders and the muxer etc. should be clear enough (near the bottom of the file).
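The core of that pattern looks roughly like the sketch below; the dimensions, rates and codec choices are placeholders rather than the exact beamstreams.js code:

```js
const beamcoder = require('beamcoder');

// Sketch of encoder + muxer setup, loosely following examples/make_mp4.js
// and the bottom of beamstreams.js. All numbers here are placeholders.
async function setupVideoPipeline() {
  const encoder = await beamcoder.encoder({
    name: 'libx264',
    width: 1280,
    height: 720,
    pix_fmt: 'yuv420p',
    bit_rate: 2000000,
    time_base: [1, 25],
    framerate: [25, 1],
    gop_size: 50
  });

  const mux = beamcoder.muxer({ format_name: 'mp4' });
  const vstr = mux.newStream({ name: 'h264', time_base: [1, 90000], interleaved: true });
  Object.assign(vstr.codecpar, { width: 1280, height: 720, format: 'yuv420p' });

  await mux.openIO({ url: 'file:out.mp4' });
  await mux.writeHeader();

  return { encoder, mux };
}
```

Frames then go through encoder.encode(), the resulting packets into mux.writeFrame(), with encoder.flush() and mux.writeTrailer() at the end.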
I think I understand what you mean now.
I'll try, thanks.
How can I use a stream of raw frames? The example works with a recorded file given a URL, but I need to use a stream (on Windows):
sources: [ { url: urls[0], ms: spec, streamIndex: 0 } ],
If you take a look from line 433 of the README.md file you will find a discussion of how to set up a demuxer stream. You need to find the right raw video demuxer (which may be near enough a no-op) and then create a Node ReadStream in order to pipe your raw frames to the demuxer. Instead of the url parameter for sources, you pass an input_stream parameter set to your ReadStream. There is a fragment of an example for audio in that section; the video version follows the same pattern, but you will have to set up different options to populate the demuxer metadata.
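Adapting that audio fragment to raw video, it would look something like this; the way the rawvideo input format is selected and its options are assumptions to check against the README's demuxing section:

```js
const beamcoder = require('beamcoder');
const fs = require('fs');

// Sketch of the demuxer-stream pattern from the README, adapted to raw video.
// The rawvideo format selection and its options are assumptions to verify.
async function rawVideoDemuxer() {
  const demuxerStream = beamcoder.demuxerStream({ highwaterMark: 65536 });

  // Any Node ReadStream works here - a file is used only as a stand-in
  // for the live source pushing raw yuv420p frames.
  fs.createReadStream('frames.yuv').pipe(demuxerStream);

  const demuxer = await demuxerStream.demuxer({
    iformat: beamcoder.demuxers().rawvideo,
    options: { video_size: '1280x720', pixel_format: 'yuv420p', framerate: '25' }
  });
  console.log(demuxer.streams.length, 'stream(s) found');
  return demuxer;
}
```

With beamstreams, the matching change is in the sources entry: { input_stream: yourReadStream, ms: spec, streamIndex: 0 } in place of the url form quoted above.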
I have raw frames in yuv420p and audio chunks in PCM. I need to generate an MP4 file from this raw video together with the audio.
The examples folder contains encoding just for video and muxing just for audio.
Can someone share an example that mixes audio and video from raw frames and chunks to generate an MP4 file?
Thank you
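For anyone attempting the same combination, a minimal sketch stitched together from examples/make_mp4.js and the README muxing example. Every size, rate and the way the raw buffers are wrapped into frames below is an assumption to adapt, and s16 PCM would first need converting to planar float (e.g. with an audio filterer) before the aac encoder accepts it:

```js
const beamcoder = require('beamcoder');

// Sketch: encode raw yuv420p frames plus fltp audio chunks into one MP4.
async function muxRaw(videoBuffers, audioBuffers) {
  const width = 1280, height = 720, fps = 25, sampleRate = 48000;

  const venc = await beamcoder.encoder({
    name: 'libx264', width, height, pix_fmt: 'yuv420p',
    bit_rate: 2000000, time_base: [1, fps], framerate: [fps, 1], gop_size: 50 });
  const aenc = await beamcoder.encoder({
    name: 'aac', sample_fmt: 'fltp', sample_rate: sampleRate,
    channels: 2, channel_layout: 'stereo', bit_rate: 128000 });

  // One muxer, two streams. Stream time bases match the encoders here; if the
  // muxer adjusts them during writeHeader, packet timestamps may need rescaling.
  const mux = beamcoder.muxer({ format_name: 'mp4' });
  const vstr = mux.newStream({ name: 'h264', time_base: [1, fps], interleaved: true });
  Object.assign(vstr.codecpar, { width, height, format: 'yuv420p' });
  const astr = mux.newStream({ name: 'aac', time_base: [1, sampleRate], interleaved: true });
  Object.assign(astr.codecpar, { sample_rate: sampleRate, format: 'fltp',
    channels: 2, channel_layout: 'stereo' });

  await mux.openIO({ url: 'file:out.mp4' });
  await mux.writeHeader();

  const writePackets = async (result, index) => {
    for (const pkt of result.packets) {
      pkt.stream_index = index;
      await mux.writeFrame(pkt);
    }
  };

  for (let i = 0; i < videoBuffers.length; ++i) {
    // videoBuffers[i] is assumed to be [Y, U, V] plane buffers for one frame,
    // sized to the linesize the encoder expects.
    const frame = beamcoder.frame({
      width, height, format: 'yuv420p', pts: i, data: videoBuffers[i] });
    await writePackets(await venc.encode(frame), vstr.index);
  }
  for (let i = 0; i < audioBuffers.length; ++i) {
    // audioBuffers[i] is assumed to hold 1024 fltp samples per channel (one plane per channel).
    const frame = beamcoder.frame({
      format: 'fltp', sample_rate: sampleRate, nb_samples: 1024, pts: i * 1024,
      channels: 2, channel_layout: 'stereo', data: audioBuffers[i] });
    await writePackets(await aenc.encode(frame), astr.index);
  }

  await writePackets(await venc.flush(), vstr.index);
  await writePackets(await aenc.flush(), astr.index);
  await mux.writeTrailer();
}
```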