kokorin / Jaffree

Java ffmpeg and ffprobe command-line wrapper
Apache License 2.0

Draw BufferedImage(ARGB) on video frames? #351

Closed Murmur closed 1 year ago

Murmur commented 1 year ago

(I was browsing the examples but am not sure about the following scenario.) I want to draw a BufferedImage (ARGB) on video frames: one generated image per frame, containing a QR code (frame number + time + resolution), header text with the same info, and a color box. The QR code and color box should move a little on every frame, e.g. rotate inside a defined region.

The source file is "any" mp4 (h264/h265) video with a single audio track. I wrote a Java tool that writes image-000001.png ... N.png images with the correct per-frame content, then used an ffmpeg command line to overlay the video with the PNG images.

This works fine for a short video, as long as the content does not need to be recreated too many times.

Is there an example of drawing a BufferedImage (ARGB) in a runtime filter and overlaying it on an input video file? Speed is not a big concern.

ffmpeg cmdline video+frame png overlay

set input="/test/tos_1080p_25fps.mp4"
set inputimg="/test/1920x1080/image-%%06d.png"
set output="/test/tos_new.mp4"

@rem GOP/IDR interval 1.92s/0.96s, later use dash segdur 3.84s (fps is still 25fps)
"ffmpeg" -i %input% ^
  -framerate 25 -loop 1 -i %inputimg% -threads 4 -preset fast ^
  -c:v libx264 -crf 28 -profile:v high -level 4.0 -maxrate:v 4500k -bufsize:v 9000k ^
  -pix_fmt yuv420p -refs 3 -bf 3 -g 48 -keyint_min 24 -b_strategy 1 -flags +cgop -sc_threshold 0 ^
  -movflags negative_cts_offsets+faststart ^
  -color_range tv -colorspace bt709 -color_primaries bt709 -color_trc bt709 ^
  -sn -metadata:s:v:0 "language=eng" -metadata:s:a:0 "language=eng" ^
  -filter_complex "[0:v][1]overlay,scale=1920x1080:out_range=tv:out_color_matrix=bt709:flags=full_chroma_int+accurate_rnd,format=yuv420p,setsar=1/1[out]" ^
  -map "[out]" ^
  -c:a aac -b:a 128k -maxrate:a 128k -bufsize:a 128k -af aresample=48000 -ar 48000 -ac 2 ^
  -map "0:a" ^
  -y "%output%"
kokorin commented 1 year ago

Hello, first of all, the Discussions section is more suitable for such questions.

If the ffmpeg command you provided works for you, then you can use Jaffree to run it. Otherwise you can use Jaffree to generate the video stream programmatically:

  1. The first input is the media file with audio
  2. The second input is a FrameInput which generates a BufferedImage for each frame. It should produce a single video stream (i.e. track) without audio
  3. The output is an mp4 file, as in the example you gave

Optionally, you can use ffmpeg filters to produce several output files with different resolutions, or a single file with several video streams of different resolutions.
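The three steps above can be sketched with Jaffree roughly as follows. The class and method names (`FrameInput`, `FrameProducer`, `Frame.createVideoFrame`, `setComplexFilter`) follow Jaffree's public API, but the filter string, paths, resolution, and frame count are illustrative assumptions, and the encoder options from the original command line are omitted for brevity:

```java
import com.github.kokorin.jaffree.ffmpeg.*;

import java.awt.Color;
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;
import java.nio.file.Paths;
import java.util.Collections;
import java.util.List;

public class OverlayExample {
    public static void main(String[] args) {
        FrameProducer producer = new FrameProducer() {
            private long frameCounter = 0;

            @Override
            public List<Stream> produceStreams() {
                // One video stream, no audio; timebase 25 means pts is counted in 1/25 s
                return Collections.singletonList(new Stream()
                        .setType(Stream.Type.VIDEO)
                        .setTimebase(25L)
                        .setWidth(1920)
                        .setHeight(1080));
            }

            @Override
            public Frame produce() {
                if (frameCounter >= 25 * 60) {
                    return null; // null signals end of stream (one minute here, as an example)
                }
                BufferedImage image =
                        new BufferedImage(1920, 1080, BufferedImage.TYPE_INT_ARGB);
                Graphics2D g = image.createGraphics();
                g.setColor(Color.WHITE);
                g.drawString("frame " + frameCounter, 50, 50); // header text, QR code, color box...
                g.dispose();
                // With timebase 25, pts advances by 1 per frame
                Frame frame = Frame.createVideoFrame(0, frameCounter, image);
                frameCounter++;
                return frame;
            }
        };

        FFmpeg.atPath()
                .addInput(UrlInput.fromPath(Paths.get("/test/tos_1080p_25fps.mp4")))
                .addInput(FrameInput.withProducer(producer))
                .setComplexFilter("[0:v][1:v]overlay,format=yuv420p[out]")
                .addOutput(UrlOutput.toPath(Paths.get("/test/tos_new.mp4"))
                        .addArguments("-map", "[out]")
                        .addArguments("-map", "0:a"))
                .execute();
    }
}
```

The `-map` options are attached to the output, matching their position in the original command line; the x264/aac settings from that command could be added the same way via `addArguments`.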

Murmur commented 1 year ago

Thanks, I did not realize Discussions was an active tab in this project; many GitHub projects do not use it. Yes, my ffmpeg command line works fine with predefined PNG images. I will give FrameInput (BufferedImage ARGB) a try as a second input, creating transparent ARGB image buffers at runtime.

Writing a massive number of PNG images per video resolution to an SSD, possibly recreating them a few times for multiple files, did not sound like a good plan. The Jaffree library might give me a flexible solution.

kokorin commented 1 year ago

Are you sure you need transparency? Not all players can play it correctly. Usually it's needed when you want to overlay the video on top of something else at runtime.

Murmur commented 1 year ago

>> Are you sure you need transparency?

I mean my BufferedImage ARGB frame has transparency, but the final output video is a regular video file: no multiple layers, no transparent video format.

My impression was that the second FrameInput input is my customized Java class submitting an in-memory, ad-hoc ARGB image per incoming frame, and that ffmpeg blends the first and second inputs just as it did in the standalone command line with static PNG images.
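The per-frame blending described here can also be done entirely in `java.awt`, before the frame ever reaches ffmpeg: drawing an ARGB image onto an opaque frame with Graphics2D uses SRC_OVER compositing by default, so transparent overlay pixels leave the underlying video pixels untouched. A minimal, Jaffree-independent sketch (`Compositor` is a hypothetical helper name):

```java
import java.awt.Color;
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;

public class Compositor {
    /** Draws an ARGB overlay onto an opaque frame; AWT performs the alpha blending. */
    public static BufferedImage composite(BufferedImage frame, BufferedImage overlay) {
        BufferedImage out = new BufferedImage(
                frame.getWidth(), frame.getHeight(), BufferedImage.TYPE_INT_RGB);
        Graphics2D g = out.createGraphics();
        g.drawImage(frame, 0, 0, null);
        g.drawImage(overlay, 0, 0, null); // SRC_OVER: transparent pixels show the frame
        g.dispose();
        return out;
    }

    public static void main(String[] args) {
        // Tiny demonstration: a blue 4x4 "frame" and an overlay with one opaque red pixel
        BufferedImage frame = new BufferedImage(4, 4, BufferedImage.TYPE_INT_RGB);
        Graphics2D g = frame.createGraphics();
        g.setColor(Color.BLUE);
        g.fillRect(0, 0, 4, 4);
        g.dispose();

        BufferedImage overlay = new BufferedImage(4, 4, BufferedImage.TYPE_INT_ARGB);
        overlay.setRGB(0, 0, 0xFFFF0000); // opaque red; all other pixels stay transparent

        BufferedImage out = composite(frame, overlay);
        System.out.println(Integer.toHexString(out.getRGB(0, 0))); // ffff0000 (red won)
        System.out.println(Integer.toHexString(out.getRGB(1, 1))); // ff0000ff (blue kept)
    }
}
```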

kokorin commented 1 year ago

So do you mean you want to add an overlay on top of the original video? If so, I would recommend running 2 instances of Jaffree:

  1. The first reads the original video with a FrameOutput, which sends BufferedImages to a concurrent queue
  2. The second reads images from the queue, adds an overlay, and uses a FrameInput to pass the overlaid images to ffmpeg

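The two-instance pipeline described above might look roughly like this. The Jaffree class names (`FrameOutput`, `FrameConsumer`, `executeAsync`, `FFmpegResultFuture`) are from its public API; the queue capacity, the fixed stream parameters, and the overlay drawing are assumptions — in a real program the `Stream` parameters returned by `produceStreams()` should match the decoded stream, and audio frames would need handling too:

```java
import com.github.kokorin.jaffree.ffmpeg.*;

import java.awt.Color;
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;
import java.nio.file.Paths;
import java.util.Collections;
import java.util.List;
import java.util.Optional;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class OverlayPipeline {
    public static void main(String[] args) throws Exception {
        // Optional is used because BlockingQueue rejects null,
        // while Jaffree uses a null Frame to signal end-of-stream.
        BlockingQueue<Optional<Frame>> queue = new ArrayBlockingQueue<>(16);

        // Instance 1: decode source.mp4 and push frames to the queue (asynchronous)
        FFmpegResultFuture decoding = FFmpeg.atPath()
                .addInput(UrlInput.fromPath(Paths.get("source.mp4")))
                .addOutput(FrameOutput.withConsumer(new FrameConsumer() {
                    @Override public void consumeStreams(List<Stream> streams) { }
                    @Override public void consume(Frame frame) {
                        try {
                            queue.put(Optional.ofNullable(frame));
                        } catch (InterruptedException e) {
                            Thread.currentThread().interrupt();
                        }
                    }
                }))
                .executeAsync();

        // Instance 2: pull frames, draw the overlay, hand them to the encoder
        FrameProducer producer = new FrameProducer() {
            @Override public List<Stream> produceStreams() {
                // Assumed to match the decoded stream; query it in real code
                return Collections.singletonList(new Stream()
                        .setType(Stream.Type.VIDEO)
                        .setTimebase(1000L)
                        .setWidth(1920)
                        .setHeight(1080));
            }
            @Override public Frame produce() {
                try {
                    Frame frame = queue.take().orElse(null);
                    if (frame == null) {
                        return null; // propagate end-of-stream so the output is finalized
                    }
                    BufferedImage image = frame.getImage();
                    Graphics2D g = image.createGraphics();
                    g.setColor(Color.WHITE);
                    g.drawString("pts=" + frame.getPts(), 50, 50); // overlay goes here
                    g.dispose();
                    return Frame.createVideoFrame(0, frame.getPts(), image);
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                    return null;
                }
            }
        };

        FFmpeg.atPath()
                .addInput(FrameInput.withProducer(producer))
                .addOutput(UrlOutput.toPath(Paths.get("output.mp4")))
                .execute(); // blocks until encoding finishes

        decoding.get(); // surface any decoding errors
    }
}
```

The bounded queue provides backpressure: if the encoder falls behind, the decoder blocks in `put` instead of filling memory with decoded frames.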
Murmur commented 1 year ago

2 instances of Jaffree: Is this something I can do in a single app, e.g. two threads, without two processes, pipe files, or localhost TCP streams? Concurrent queue: do you mean a thread-safe queue from the Java collections?

Thread 1: grab BufferedImage RGB frames from source.mp4, add them to the queue. | Thread 2: take a frame from the queue, draw the overlay graphics, push the BufferedImage to FrameInput -> ffmpeg encoding params to output.mp4.

kokorin commented 1 year ago

2 instances of Jaffree: Is this something I can do in a single app, e.g. two threads, without two processes, pipe files, or localhost TCP streams?

Yes, you can run 2 instances in the same JVM. Just don't forget that you either need to start them in different threads or use the executeAsync() method.

Concurrent queue: do you mean a thread-safe queue from the Java collections?

Yes.

Thread 1: grab BufferedImage RGB frames from source.mp4, add them to the queue. | Thread 2: take a frame from the queue, draw the overlay graphics, push the BufferedImage to FrameInput -> ffmpeg encoding params to output.mp4.

You've got the idea. Don't forget that you need to stop the second ffmpeg gracefully (otherwise the target file may be corrupted). So you need to send null via FrameInput to signal to Jaffree that there are no more frames.