rigaya / QSVEnc

Performance experiments with fast QSV encoding
http://rigaya34589.blog135.fc2.com/blog-category-10.html

10bit HW decoding not working (skylake - linux) #63

Closed: bavdevc closed this issue 1 year ago

bavdevc commented 3 years ago

Hello rigaya,

I'm using the Skylake feature set, so 10bit HEVC HW decoding should be possible:

Supported Decode features:

            H.264  HEVC   MPEG2  VP8    VP9    AV1    
    yuv420  8bit  10bit   8bit   8bit         10bit 
    yuv422                                          
    yuv444                                    12bit 

and https://en.wikichip.org/wiki/intel/hd_graphics/530#Hardware_Accelerated_Video

but the current QSVEnc build exits with an error: Failed to initialize decoder. : undeveloped feature.

@lizhong1008 (from Intel) wrote 2 years ago:

> I checked MediaServerStudioProfessional2017 release guide, it is said that: MAIN10 profile encoding/decoding and range extensions profile encoding can be done only with system memory at input and output on linux (SDK/driver limitation for video memory frames allocation)

but when I try to use system memory with the option --disable-va (Linux) I get:

MFXVPP: Failed to get required buffer size for MFXVPP: undeveloped feature.
MFXVPP: Failed to get required buffer size for MFXVPP: undeveloped feature.
MFXVPP: Failed to initialize vpp: invalid video parameters.
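For context: as far as I understand the Media SDK flow, the "required buffer size" step corresponds to MFXVideoVPP_QueryIOSurf. Below is a minimal, hedged sketch of a VPP configured entirely with system-memory surfaces (roughly what --disable-va should end up requesting); it is illustrative only, not QSVEnc's actual code, and the resolution, frame rate and build command are assumptions.

```c
/*
 * Minimal sketch (not QSVEnc's actual code): configure a Media SDK VPP
 * with system memory at both input and output and ask the driver how
 * many surfaces it needs ("required buffer size" = MFXVideoVPP_QueryIOSurf).
 * Note: on Linux a hardware session normally needs a VA display attached
 * via MFXVideoCORE_SetHandle(session, MFX_HANDLE_VA_DISPLAY, ...) first;
 * omitted here for brevity. Build (assumption): gcc vpp_query.c -lmfx
 */
#include <stdio.h>
#include <string.h>
#include <mfxvideo.h>

int main(void) {
    mfxVersion ver = { {19, 1} };   /* API 1.19+ (needed for P010) */
    mfxSession session;
    mfxStatus sts = MFXInit(MFX_IMPL_HARDWARE_ANY, &ver, &session);
    if (sts != MFX_ERR_NONE) { fprintf(stderr, "MFXInit: %d\n", sts); return 1; }

    mfxVideoParam vpp;
    memset(&vpp, 0, sizeof(vpp));
    /* system memory in and out, as the Intel note describes */
    vpp.IOPattern = MFX_IOPATTERN_IN_SYSTEM_MEMORY | MFX_IOPATTERN_OUT_SYSTEM_MEMORY;
    vpp.vpp.In.FourCC         = MFX_FOURCC_P010;      /* 10bit 4:2:0 */
    vpp.vpp.In.ChromaFormat   = MFX_CHROMAFORMAT_YUV420;
    vpp.vpp.In.BitDepthLuma   = 10;
    vpp.vpp.In.BitDepthChroma = 10;
    vpp.vpp.In.Shift          = 1;
    vpp.vpp.In.Width  = 1920; vpp.vpp.In.CropW = 1920;
    vpp.vpp.In.Height = 1088; vpp.vpp.In.CropH = 1080;
    vpp.vpp.In.FrameRateExtN  = 30;
    vpp.vpp.In.FrameRateExtD  = 1;
    vpp.vpp.In.PicStruct      = MFX_PICSTRUCT_PROGRESSIVE;
    vpp.vpp.Out = vpp.vpp.In;                         /* pass-through */

    mfxFrameAllocRequest req[2];
    memset(req, 0, sizeof(req));
    sts = MFXVideoVPP_QueryIOSurf(session, &vpp, req);
    printf("QueryIOSurf: %d (in %u / out %u surfaces suggested)\n",
           sts, (unsigned)req[0].NumFrameSuggested, (unsigned)req[1].NumFrameSuggested);

    MFXClose(session);
    return 0;
}
```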

--> Please help me if this should be available/possible; I was not able to fix it on my own.

kind regards

mikk9 commented 3 years ago

Skylake doesn't support HEVC 10bit; Kaby Lake was Intel's first generation with 10bit HEVC. The AV1 detection seems wrong, by the way.

rigaya commented 3 years ago

@bavdevc I also think Skylake should have HEVC 10bit hw decode at least partially supported (not encode).

Although I have no Skylake environment to test with, I've found that --disable-va was not working in the Kaby Lake environment I have, returning an error message similar to the one you mentioned. I fixed it in QSVEnc 5.06, so I hope it works on your Skylake system too. Please give it a try.

Please note that OpenCL-based filters are not supported when using --disable-va.

@mikk9 I've checked the AV1 hw decode detection, but somehow the driver on Kaby Lake (and probably Skylake as well) reports no error when the app asks it about AV1 decode support. I've kept it as-is for now, as the driver might get fixed, but we might need a workaround in the future.
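For reference, the table printed by --check-features is built by probing the driver, essentially through the Media SDK Query API. A minimal sketch of such a probe is shown below; it is not the actual QSVEnc code, and MFX_CODEC_AV1 plus the requested API version are assumptions about the installed headers. When the driver wrongly answers MFX_ERR_NONE, a table built this way gains entries the hardware cannot honor.

```c
/*
 * Minimal sketch (not QSVEnc's actual code): probe hw decode support
 * with the Media SDK Query API, which is what a --check-features style
 * table is built from. MFX_CODEC_AV1 needs API 1.34+ headers.
 * Note: on Linux a hardware session normally needs a VA display attached
 * via MFXVideoCORE_SetHandle(session, MFX_HANDLE_VA_DISPLAY, vaDisplay)
 * before querying; omitted here for brevity.
 */
#include <stdio.h>
#include <string.h>
#include <mfxvideo.h>

static const char *probe_decode(mfxSession session, mfxU32 codec, mfxU16 bitdepth) {
    mfxVideoParam in, out;
    memset(&in,  0, sizeof(in));
    memset(&out, 0, sizeof(out));
    in.mfx.CodecId = codec;
    in.IOPattern   = MFX_IOPATTERN_OUT_VIDEO_MEMORY;
    in.mfx.FrameInfo.FourCC         = (bitdepth > 8) ? MFX_FOURCC_P010 : MFX_FOURCC_NV12;
    in.mfx.FrameInfo.ChromaFormat   = MFX_CHROMAFORMAT_YUV420;
    in.mfx.FrameInfo.BitDepthLuma   = bitdepth;
    in.mfx.FrameInfo.BitDepthChroma = bitdepth;
    in.mfx.FrameInfo.Shift          = (bitdepth > 8) ? 1 : 0;
    in.mfx.FrameInfo.Width  = 1920; in.mfx.FrameInfo.CropW = 1920;
    in.mfx.FrameInfo.Height = 1088; in.mfx.FrameInfo.CropH = 1080;
    in.mfx.FrameInfo.PicStruct = MFX_PICSTRUCT_PROGRESSIVE;
    out.mfx.CodecId = codec;

    /* If a driver answers MFX_ERR_NONE here for a codec the hardware
     * cannot decode, the feature table ends up with phantom entries. */
    mfxStatus sts = MFXVideoDECODE_Query(session, &in, &out);
    if (sts == MFX_ERR_NONE)                 return "supported";
    if (sts == MFX_WRN_PARTIAL_ACCELERATION) return "partial (hybrid)";
    return "unsupported";
}

int main(void) {
    mfxVersion ver = { {34, 1} };  /* API 1.34+; lower is fine if only probing HEVC */
    mfxSession session;
    if (MFXInit(MFX_IMPL_HARDWARE_ANY, &ver, &session) != MFX_ERR_NONE) {
        fprintf(stderr, "MFXInit failed\n");
        return 1;
    }
    printf("HEVC 10bit decode: %s\n", probe_decode(session, MFX_CODEC_HEVC, 10));
    printf("AV1  10bit decode: %s\n", probe_decode(session, MFX_CODEC_AV1,  10));
    MFXClose(session);
    return 0;
}
```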

bavdevc commented 3 years ago

> Although I have no Skylake environment to test with, I've found that --disable-va was not working in the Kaby Lake environment I have, returning an error message similar to the one you mentioned. I fixed it in QSVEnc 5.06, so I hope it works on your Skylake system too. Please give it a try.

I tested the --disable-va fix in 5.06 - it's working great so far.

HEVC Main10 HW decoding still fails with: Failed to initialize decoder. : undeveloped feature.

--> What is the best way to analyse that error in more detail? I don't have the VTune profiler or anything similar installed at the moment. Perhaps mikk9 is correct in saying "unsupported" - the information from Intel is inconsistent at best.

> Please note that OpenCL-based filters are not supported when using --disable-va.

True - and that makes the missing HEVC 10bit HW decoding functionality on Skylake a minor issue. Does the limitation regarding 10bit decoding and OpenCL filters also exist on more recent Intel generations?

mikk9 commented 3 years ago

There is no 10bit HEVC hardware support in Skylake; Kaby Lake was Intel's first. The only option would be a shader-based GPU acceleration solution (hybrid decoder), which could offload some work from the CPU to the GPU for video playback. But even then, it's another question whether such a GPU shader-based solution is supported for encoding+decoding tasks.

https://github.com/intel/media-driver#decodingencoding-features

SKL HEVC 10bit... nothing there.

bavdevc commented 3 years ago

> There is no 10bit HEVC hardware support in Skylake; Kaby Lake was Intel's first. The only option would be a shader-based GPU acceleration solution (hybrid decoder), which could offload some work from the CPU to the GPU for video playback.

I think you are right - not every gen9 supports HEVC Main10 decoding:

| Platform | GPU gen | Added codec support |
|---|---|---|
| Braswell | gen8 | H.265 decode; JPEG, VP8 encode |
| Skylake | gen9 | H.265 encode |
| Apollo Lake | gen9 | VP9, H.265 Main10 decode |
| Kaby Lake | gen9.5 | VP9 profile 2 decode; VP9, H.265 Main10 encode |

I was misled by the Skylake GPU presentation: https://www.anandtech.com/show/9562/intels-skylake-gpu-analyzing-the-media-capabilities

> Skylake's MFX engine adds HEVC Main profile decode support (4Kp60 at up to 240 Mbps). Main10 decoding can be done with GPU acceleration. The Quick Sync PG Mode supports HEVC encoding (again, Main profile only, with support for up to 4Kp60 streams). The DXVA Checker screenshot (taken on a i7-6700K, a part with Intel HD Graphics 530 / GT2) for Skylake with driver version 10.18.15.4248 is produced below. HEVC_VLD_Main10 has a DXVA profile, but it is done partially in the GPU (as specified in the slide above).

and the feature table at wikichip.org.

Thank you all for your help!

kind regards

bavdevc commented 3 years ago

@rigaya the program flow for RGY_WRN_PARTIAL_ACCELERATION is not taken in this case - I couldn't find any log/debug messages containing "partial".

The output of --check-features still reports it as supported: qsv_features_i7-6700k.txt

According to that output, decoding HEVC yuv420 10bit should be possible - perhaps the feature detection is a bit off for HEVC, like it is for AV1 (driver issue).

rigaya commented 3 years ago

The result of --check-features (which is based on the Query API of the Media SDK) seems to be wrong.

I've found that the Intel VAAPI driver supports HEVC 10bit hw decode from Broxton onward, not from Skylake. https://github.com/intel/intel-vaapi-driver/blob/master/README#L50

Therefore, I think HEVC 10bit hw decode on Skylake Linux is not possible.
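One way to double-check what the VAAPI driver itself advertises, independently of the Media SDK Query result, is to enumerate the VA profiles directly - roughly what vainfo does. A hedged sketch, assuming the render node is /dev/dri/renderD128 and linking with -lva -lva-drm:

```c
/* Sketch: ask the VAAPI driver whether it exposes HEVC Main10 decode.
 * Roughly what vainfo reports; the render node path is an assumption. */
#include <fcntl.h>
#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>
#include <va/va.h>
#include <va/va_drm.h>

int main(void) {
    int fd = open("/dev/dri/renderD128", O_RDWR);
    if (fd < 0) { perror("open render node"); return 1; }

    VADisplay dpy = vaGetDisplayDRM(fd);
    int major, minor;
    if (vaInitialize(dpy, &major, &minor) != VA_STATUS_SUCCESS) {
        fprintf(stderr, "vaInitialize failed\n");
        return 1;
    }

    int num = vaMaxNumProfiles(dpy);
    VAProfile *profiles = malloc(num * sizeof(*profiles));
    vaQueryConfigProfiles(dpy, profiles, &num);

    int found = 0;
    for (int i = 0; i < num; i++) {
        if (profiles[i] != VAProfileHEVCMain10)
            continue;
        /* profile exists -- check for a real decode entrypoint (VLD) */
        int num_ep = vaMaxNumEntrypoints(dpy);
        VAEntrypoint *ep = malloc(num_ep * sizeof(*ep));
        vaQueryConfigEntrypoints(dpy, VAProfileHEVCMain10, ep, &num_ep);
        for (int j = 0; j < num_ep; j++)
            if (ep[j] == VAEntrypointVLD)
                found = 1;
        free(ep);
    }
    printf("HEVC Main10 VLD decode: %s\n", found ? "advertised" : "not advertised");

    free(profiles);
    vaTerminate(dpy);
    close(fd);
    return 0;
}
```

On Skylake with the intel-vaapi-driver this should come back "not advertised" for HEVC Main10, matching the README line linked above.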

bavdevc commented 1 year ago

Question answered - the hardware doesn't support it. Thank you!