fraunhoferhhi / vvdec

VVdeC, the Fraunhofer Versatile Video Decoder
https://www.hhi.fraunhofer.de/en/departments/vca/technologies-and-solutions/h266-vvc.html
BSD 3-Clause Clear License

Power consumption/CPU requirements question for a 4K VVC clip #184

Closed: birdie-github closed this 2 weeks ago

birdie-github commented 2 weeks ago

My Ryzen 7 7840HS is struggling to decode this VVC sample clip using vvdec 2.3 / ffmpeg 6.1.1 / mpv 0.38. Power consumption goes over 40 W, which is insane.

The clip's specs are:

Stream #0:1[0x101]: Video: vvc (Main 10) (3[0][0][0] / 0x0033), yuv420p10le(tv, bt709), 3840x2160, 59.94 fps, 59.94 tbr, 90k tbn
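(For reference, a summary like the one above can be pulled with ffprobe; `input.ts` is a placeholder for the actual file:)

```
ffprobe -v error -select_streams v:0 \
  -show_entries stream=codec_name,profile,width,height,pix_fmt,avg_frame_rate \
  input.ts
```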

Is the clip just that "bad", or is there something wrong with the library or my config?

adamjw24 commented 2 weeks ago

~Do you have an annex B for this bitstream? I am struggling here a bit with handling the TS file.~

Edit: apparently, very intuitively, the vvc_mp4toannexb bitstream filter takes care of that.
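For reference, a sketch of the extraction, assuming the ffmpeg build has the vvc_mp4toannexb bitstream filter and a raw VVC muxer (filenames are placeholders):

```
# copy the video stream, converting it to Annex B, into a raw .266 file
ffmpeg -i input.ts -an -c:v copy -bsf:v vvc_mp4toannexb -f vvc output.266
```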

adamjw24 commented 2 weeks ago

At first look the sequence seems alright, not even particularly heavy, for a 4Kp60 stream of course.

Could you confirm that the power usage actually comes from VVdeC? I.e., extract the Annex B stream, decode it on its own, and report the power usage?
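A sketch of that measurement, assuming vvdecapp is on PATH and the kernel exposes RAPL energy counters to perf (`stream.266` is a placeholder; the exact event name can differ per CPU):

```
# decode the extracted Annex B stream as fast as possible, discarding output
vvdecapp -b stream.266 -o /dev/null

# measure package energy while decoding (RAPL events are system-wide, hence -a)
sudo perf stat -a -e power/energy-pkg/ -- vvdecapp -b stream.266 -o /dev/null
```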

birdie-github commented 2 weeks ago

> Could you confirm that the power usage actually comes from VVdeC?

perf record ffplay *.ts: perf.zip

You can use perf report to see which functions stress the CPU, and it's all VVdeC from there.
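For anyone reproducing this, the sequence is roughly (the clip name is a placeholder):

```
perf record -g ffplay clip.ts   # sample call stacks while playing
perf report --sort symbol       # rank functions by CPU time
```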

lehmann-c commented 2 weeks ago

What are you expecting when decoding 4K@60Hz? That's the expected consumption for VVC decoding. When decoding with vvdecapp, the power consumption would go up to 80-100 W, since vvdecapp decodes as fast as possible, whereas ffplay decodes in real time. Have you tested with the ffmpeg VVC decoder as well? It needs approx. 10-15 W more power than vvdec.
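To compare the two, a sketch assuming a build where the vvdec wrapper is exposed as libvvdec and the native decoder as vvc (decoder names depend on how ffmpeg was built, and the native VVC decoder only exists in newer ffmpeg):

```
ffplay -vcodec libvvdec clip.ts   # decode via the vvdec library
ffplay -vcodec vvc clip.ts        # decode via ffmpeg's native VVC decoder
```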

birdie-github commented 2 weeks ago

@lehmann-c

I vaguely remember that VVC was promised to be about twice as computationally expensive as HEVC, but it looks more like 3-4 times as expensive, since at 4K the latter barely needs more than 15 W on the same CPU.

Maybe there are some optimizations yet to be enabled, because this clip seemingly uses HDR or something, and it's also at 60 Hz.

Anyway, I've closed the issue since I'm not sure if it's valid.

adamjw24 commented 2 weeks ago

> Maybe there are some optimizations yet to be enabled, because this clip seemingly uses HDR or something, and it's also at 60 Hz.

Hence my previous question. I think mpv would auto-tune any bitstream color space to your display, no? That could add some CPU cycles, but probably nothing in the range of the actual decoding.

HDR processing itself is fairly transparent to the decoder; there is nothing extra the decoder needs to do. It's the rest of the display pipeline that might have more work to do.
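One rough way to take the display pipeline out of the equation, as a sanity check (a common mpv benchmarking idiom, not specific to VVdeC; the clip name is a placeholder):

```
# decode as usual but render nothing and skip presentation timing
mpv --no-audio --untimed --vo=null clip.ts
```

If the power draw stays in the same range with --vo=null, the cost is in decoding rather than in color conversion or display.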