Open kamrapooja opened 1 month ago
We can easily accomplish this with the ffmpeg program: http://bytedeco.org/javacpp-presets/ffmpeg/apidocs/org/bytedeco/ffmpeg/ffmpeg.html
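For reference, the linked `ffmpeg` program class can be invoked from Java roughly like this. This is only a sketch: the input file names, the filter string, and `output.mp4` are placeholders, and it assumes the `org.bytedeco:ffmpeg-platform` dependency is on the classpath.

```java
import org.bytedeco.javacpp.Loader;

public class HstackWithFfmpeg {
    public static void main(String[] args) throws Exception {
        // Loader.load() extracts the bundled ffmpeg executable and returns its path
        String ffmpeg = Loader.load(org.bytedeco.ffmpeg.ffmpeg.class);
        ProcessBuilder pb = new ProcessBuilder(ffmpeg,
                "-i", "a.webm", "-i", "b.webm",
                "-filter_complex",
                "[0:v][1:v]hstack=inputs=2[v];[0:a][1:a]amerge=inputs=2[a]",
                "-map", "[v]", "-map", "[a]",
                "-y", "output.mp4");
        pb.inheritIO().start().waitFor();
    }
}
```

This runs the same command line as the attachment, just with the ffmpeg binary that ships with the bytedeco presets, so no separate ffmpeg installation is needed.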
We are able to merge the videos side by side using the ffmpeg command, but we are not able to do it with JavaCV. The attachment shows the ffmpeg command we use. Could you please help with how to get the merged video using JavaCV functions? We are using:

```java
FFmpegFrameFilter filter = new FFmpegFrameFilter(
        "[0:v][1:v]hstack=inputs=2[v]",
        "[0:a][1:a]amerge[a]",
        grabber.getImageWidth(), grabber.getImageHeight(),
        grabber.getAudioChannels());
```
`avfilter_graph_config()` fails with error -22 (AVERROR(EINVAL)).
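One common cause of -22 (EINVAL) here: the filter graph references a second input (`[1:v]`, `[1:a]`), but `FFmpegFrameFilter` defaults to a single video and audio input, and `amerge` also needs a known sample rate. A sketch of the configuration that may fix this, assuming the `setVideoInputs`/`setAudioInputs`/`setSampleRate`/`setFrameRate` setters available on recent JavaCV versions:

```java
FFmpegFrameFilter filter = new FFmpegFrameFilter(
        "[0:v][1:v]hstack=inputs=2[v]",
        "[0:a][1:a]amerge=inputs=2[a]",
        grabber.getImageWidth(), grabber.getImageHeight(),
        grabber.getAudioChannels());
filter.setVideoInputs(2);   // the graph refers to [1:v], so declare two video inputs
filter.setAudioInputs(2);   // likewise for [1:a]
filter.setSampleRate(grabber.getSampleRate()); // amerge needs the sample rate
filter.setFrameRate(grabber.getFrameRate());
filter.start();             // avfilter_graph_config() runs here
```

Without the input counts, the graph declares only `[0:v]`/`[0:a]`, so configuring a graph that names `[1:v]`/`[1:a]` is rejected as invalid.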
I have two webm files, a.webm and b.webm, with the same dimensions and frame rate. The goal is to merge both videos side by side horizontally and get an MP4 as output. I took the reference code given in https://github.com/bytedeco/javacv/issues/955, but that did not work. I am not able to understand how the code snippet below works:

```java
while ((frame2 = grabber.grab()) != null) {
    if (frame2.image != null) {
        a++;
    }
    if (frame2.samples != null) {
        b++;
    }
    filter.push(0, frame2);
    filter.push(1, frame2);
    Frame frame3;
    while ((frame3 = filter.pull()) != null) {
        if (frame3.image != null) {
            c++;
            assertEquals(640, frame3.imageWidth);
            assertEquals(200, frame3.imageHeight);
            assertEquals(3, frame3.imageChannels);
        }
        if (frame3.samples != null) {
            d++;
            assertEquals(2, frame3.audioChannels);
            assertEquals(1, frame3.samples.length);
            assertTrue(frame3.samples[0] instanceof ByteBuffer);
            assertEquals(frame2.samples.length, frame3.samples.length);
            assertEquals(frame2.samples[0].limit(), frame3.samples[0].limit());
        }
    }
}
```
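Note that the test code above pushes the same frame to both filter inputs (`filter.push(0, frame2); filter.push(1, frame2)`), which only makes sense for a unit test stacking one stream next to itself. For two separate files, a sketch with two grabbers and a recorder might look like the following. This is an assumption-laden outline, not a verified implementation: the file names, output settings, channel count, and the simple lockstep grab loop are all placeholders to adapt.

```java
import org.bytedeco.javacv.FFmpegFrameFilter;
import org.bytedeco.javacv.FFmpegFrameGrabber;
import org.bytedeco.javacv.FFmpegFrameRecorder;
import org.bytedeco.javacv.Frame;

public class SideBySideMerge {
    public static void main(String[] args) throws Exception {
        FFmpegFrameGrabber g0 = new FFmpegFrameGrabber("a.webm");
        FFmpegFrameGrabber g1 = new FFmpegFrameGrabber("b.webm");
        g0.start();
        g1.start();

        FFmpegFrameFilter filter = new FFmpegFrameFilter(
                "[0:v][1:v]hstack=inputs=2[v]",
                "[0:a][1:a]amerge=inputs=2[a]",
                g0.getImageWidth(), g0.getImageHeight(),
                g0.getAudioChannels());
        filter.setVideoInputs(2);
        filter.setAudioInputs(2);
        filter.setSampleRate(g0.getSampleRate());
        filter.setFrameRate(g0.getFrameRate());
        filter.start();

        FFmpegFrameRecorder recorder = new FFmpegFrameRecorder(
                "output.mp4",
                g0.getImageWidth() * 2,  // hstack doubles the width
                g0.getImageHeight(),
                2);  // amerge sums the input channels; adjust to the filter's output
        recorder.setFrameRate(g0.getFrameRate());
        recorder.setSampleRate(g0.getSampleRate());
        recorder.start();

        // Simple lockstep loop: grab one frame from each file, push each to
        // its own filter input, then drain the filter into the recorder.
        Frame f0, f1, out;
        while ((f0 = g0.grab()) != null && (f1 = g1.grab()) != null) {
            filter.push(0, f0);
            filter.push(1, f1);
            while ((out = filter.pull()) != null) {
                recorder.record(out);
            }
        }

        recorder.stop();
        filter.stop();
        g1.stop();
        g0.stop();
    }
}
```

The lockstep loop assumes both files are the same length and that audio and video frames arrive in compatible order; real inputs may need per-stream buffering or timestamp handling.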
Please assist.