Closed Murmur closed 1 year ago
Hello, first of all, the Discussions section is more suitable for such questions.

If the ffmpeg command you provided works for you, then you can use Jaffree to run it. Otherwise you can use Jaffree to generate the video stream programmatically with `FrameInput`, which generates a `BufferedImage` for each frame. It should produce a single video stream (i.e. track) without audio. Optionally you can use ffmpeg filters to produce several output files with different resolutions, or a single file with several video streams of different resolutions.
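A minimal sketch of that "one `BufferedImage` per frame" idea in plain Java (the `FrameRenderer` class below is my own illustration, not Jaffree API; on the Jaffree side you would return such images from a `FrameProducer` passed to `FrameInput.withProducer(...)`, per the project's examples):

```java
import java.awt.Color;
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;

/**
 * Renders one transparent ARGB frame per frame number. With Jaffree, each
 * image would be wrapped in a video Frame by a FrameProducer handed to
 * FrameInput.withProducer(...) (assumed wiring, see the Jaffree examples).
 */
public class FrameRenderer {
    public static BufferedImage renderFrame(int frameNum, int width, int height) {
        // TYPE_INT_ARGB starts fully transparent (alpha = 0 everywhere).
        BufferedImage img = new BufferedImage(width, height, BufferedImage.TYPE_INT_ARGB);
        Graphics2D g = img.createGraphics();
        try {
            // A box whose x position depends on the frame number.
            g.setColor(Color.RED);
            int x = (frameNum * 4) % (width - 40);
            g.fillRect(x, 10, 40, 40);
            // Simple per-frame label.
            g.setColor(Color.WHITE);
            g.drawString("frame " + frameNum, 10, height - 10);
        } finally {
            g.dispose();
        }
        return img;
    }
}
```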
Thanks, I did not realize the Discussions tab was active in this project; many GitHub projects do not use it.

Yes, my ffmpeg command line works fine with predefined PNG images. I'll give it a try using `FrameInput` (`BufferedImage` ARGB) as a second input to create ARGB transparent image buffers at runtime.

Writing a massive amount of PNG images per video resolution to an SSD disk, possibly recreating them a few times for multiple files, did not sound like a good plan. The Jaffree library might give me a good, flexible solution.
Are you sure you need transparency? Not all players can play it correctly. Usually it's needed when you want to overlay video on top of something else at runtime.
>> Are you sure you need transparency?

I mean my `BufferedImage` ARGB frame has transparency, but the final output video is a regular video file: no multiple layers, no transparent video format.

My impression was that the second `FrameInput` input is my customized Java class that submits an in-memory, ad-hoc ARGB image per incoming frame, and that ffmpeg blends the first and second inputs just as it would on a standalone command line with static PNG images.
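For reference, the static-PNG variant of that standalone command line could look roughly like this (paths, frame rate, and codecs are placeholders for illustration, not the exact command from the thread):

```shell
# Overlay a numbered PNG sequence (with alpha) on top of a video;
# audio from the source file is copied through unchanged.
ffmpeg -i source.mp4 \
       -framerate 30 -i image-%06d.png \
       -filter_complex "[0:v][1:v]overlay=0:0" \
       -c:v libx264 -c:a copy output.mp4
```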
So do you mean you want to add an overlay on top of the original video? If so, I would recommend running 2 instances of Jaffree:

- `FrameOutput`, which sends `BufferedImage`s to a concurrent queue
- `FrameInput`, to pass the overlaid images to ffmpeg

>> 2 instances of Jaffree

Is this something I can do in a single app, such as two threads, without two processes and no pipe files or localhost TCP streams?

>> concurrent queue

Do you mean a Java Collections thread-safe queue?

Thread 1: grab `BufferedImage` RGB frames from source.mp4, add to queue | Thread 2: take a frame from the queue, draw the overlay graphics, push the `BufferedImage` to `FrameInput` -> ffmpeg encoding params to output.mp4
>> 2 instances of Jaffree: Is this something I can do in a single app, such as two threads, without two processes and no pipe files or localhost TCP streams?

Yes, you can run 2 instances in the same JVM. Just don't forget that you need either to start them in different threads or to use the `executeAsync()` method.

>> concurrent queue: do you mean a Java Collections thread-safe queue?

Yes.
>> Thread 1: grab `BufferedImage` RGB frames from source.mp4, add to queue | Thread 2: take a frame from the queue, draw the overlay graphics, push the `BufferedImage` to `FrameInput` -> ffmpeg encoding params to output.mp4

You've got the idea. Don't forget that you need to stop the second ffmpeg gracefully (otherwise the target file may be corrupted), so you need to send `null` via `FrameInput` to signal Jaffree that there are no more frames.
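The two-thread hand-off described above can be sketched with plain `java.util.concurrent` (the `OverlayPipeline` class and the stand-in frame handling are my own; the real Jaffree `FrameOutput`/`FrameInput` ends are replaced by placeholder produce/consume steps). Note that `BlockingQueue` rejects `null`, so a sentinel object marks end-of-stream inside the queue, while the real `FrameProducer.produce()` would return `null` to Jaffree:

```java
import java.awt.image.BufferedImage;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.atomic.AtomicInteger;

public class OverlayPipeline {
    // Sentinel marking end-of-stream, since BlockingQueue does not accept null.
    static final BufferedImage EOS = new BufferedImage(1, 1, BufferedImage.TYPE_INT_ARGB);

    public static int run(int totalFrames) {
        BlockingQueue<BufferedImage> queue = new ArrayBlockingQueue<>(8);
        AtomicInteger consumed = new AtomicInteger();

        // Thread 1: stands in for Jaffree's FrameOutput side, which would
        // receive decoded frames from source.mp4.
        Thread producer = new Thread(() -> {
            try {
                for (int i = 0; i < totalFrames; i++) {
                    queue.put(new BufferedImage(320, 240, BufferedImage.TYPE_INT_RGB));
                }
                queue.put(EOS); // graceful shutdown signal
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });

        // Thread 2: stands in for the overlay + FrameInput side; on EOS the
        // real FrameProducer.produce() would return null so Jaffree can close
        // the target file cleanly.
        Thread consumer = new Thread(() -> {
            try {
                while (true) {
                    BufferedImage frame = queue.take();
                    if (frame == EOS) break;
                    frame.setRGB(0, 0, 0xFF00FF00); // stand-in for drawing the overlay
                    consumed.incrementAndGet();
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });

        producer.start();
        consumer.start();
        try {
            producer.join();
            consumer.join();
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return consumed.get();
    }
}
```

A bounded queue (capacity 8 here) also provides back-pressure: the decode thread blocks when the overlay/encode thread falls behind, instead of buffering unbounded frames in memory.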
(I was browsing the examples but am not sure about the following scenario.)

Draw `BufferedImage` (ARGB) on video frames: I want to create an image per video frame (a QR code with frame number + time + resolution, header text with frame number + time + resolution, and a color box). The QR code + color box should jump a little on every frame, e.g. rotate inside a defined region.

The source file is "any" mp4 (h264/h265) video file with a single audio track. I made a Java tool to write `image-000001.png ... N.png` images with the correct per-frame content, and then an ffmpeg command line to overlay the video + PNG images. This works fine for a short video, with no need to recreate the content too many times.

Is there an example to draw `BufferedImage` ARGB at runtime and overlay it on an input video file? Speed is not a big concern. I use this to create `temp-360p.mp4` / `-720p.mp4` / `-1080p.mp4` track files with an ffmpeg command line (video + per-frame PNG overlay).
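A per-frame overlay like the one described (header text plus a color box rotated a little each frame) can be drawn with `Graphics2D`. `OverlayRenderer` below is a hypothetical sketch; the QR code part would come from an extra library such as ZXing, which I've left out here:

```java
import java.awt.Color;
import java.awt.Graphics2D;
import java.awt.RenderingHints;
import java.awt.image.BufferedImage;

public class OverlayRenderer {
    /**
     * Renders one transparent ARGB overlay frame: a header line with
     * frame number + time + resolution, and a semi-transparent box
     * rotated by an angle derived from the frame number.
     */
    public static BufferedImage render(int frameNum, long millis, int width, int height) {
        BufferedImage img = new BufferedImage(width, height, BufferedImage.TYPE_INT_ARGB);
        Graphics2D g = img.createGraphics();
        try {
            g.setRenderingHint(RenderingHints.KEY_ANTIALIASING,
                               RenderingHints.VALUE_ANTIALIAS_ON);
            // Header text: frame number + time + resolution.
            g.setColor(Color.WHITE);
            g.drawString(String.format("frame=%d t=%dms %dx%d",
                                       frameNum, millis, width, height), 10, 20);
            // Color box rotating inside a fixed region, 3 degrees per frame.
            g.translate(width / 2, height / 2);
            g.rotate(Math.toRadians((frameNum * 3) % 360));
            g.setColor(new Color(255, 0, 0, 200)); // semi-transparent red
            g.fillRect(-30, -30, 60, 60);
        } finally {
            g.dispose();
        }
        return img;
    }
}
```

Each returned image could then be handed to ffmpeg per frame (e.g. via Jaffree's `FrameInput`, as discussed above) instead of being written out as `image-NNNNNN.png` files first.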