Closed: bavdevc closed this issue 1 year ago.
Skylake doesn't support HEVC 10 bit; Intel's first generation with 10 bit HEVC was Kabylake. AV1 detection seems wrong, by the way.
@bavdevc I also think Skylake should have HEVC 10bit hw decode partially supported (not encode).
Although I have no Skylake environment to test, I've found that --disable-va was not working in the Kabylake environment I have, returning an error message similar to the one you mentioned. I had it fixed in QSVEnc 5.06, so I hope it works on your Skylake system too. Please give it a try.
Please note that OpenCL based filters are not supported when using --disable-va.
@mikk9 I've checked the AV1 hw decode detection, but somehow the driver on Kabylake (and probably Skylake as well) reports no error when the app asks it about support for AV1 decode. I've kept it as-is for now, as the driver might get fixed, but we may need a workaround in the future.
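Such a workaround could cross-check the driver's answer against a known-good allow-list per GPU generation, so an over-reporting driver can't enable a decoder the silicon lacks. A minimal sketch of that idea (the function name, the generation labels, and the abridged feature sets are my own assumptions for illustration, not QSVEnc code):

```python
# Sketch: trust the driver's Query result only when a per-generation
# allow-list agrees, to guard against drivers that over-report support.
# The feature sets below are abridged and assumed for illustration.
KNOWN_DECODERS = {
    "gen9":   {"h264", "hevc_main"},                                # Skylake
    "gen9.5": {"h264", "hevc_main", "hevc_main10", "vp9"},          # Kabylake
    "gen12":  {"h264", "hevc_main", "hevc_main10", "vp9", "av1"},   # Tigerlake+
}

def decode_supported(generation: str, codec: str, driver_says_ok: bool) -> bool:
    """True only when both the driver and the allow-list agree."""
    allowed = KNOWN_DECODERS.get(generation, set())
    return driver_says_ok and codec in allowed

# Kabylake's driver wrongly reports AV1 decode as supported; the
# allow-list filters that out while leaving genuine support intact:
print(decode_supported("gen9.5", "av1", True))  # False (filtered)
print(decode_supported("gen12", "av1", True))   # True
```

The drawback of this approach is the table has to be maintained by hand as new generations ship, which is presumably why keeping the driver's answer as-is and waiting for a driver fix is also reasonable.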
> Although I have no Skylake environment to test, I've found that --disable-va was not working in the Kabylake environment I have, returning an error message similar to the one you mentioned. I had it fixed in QSVEnc 5.06, so I hope it works on your Skylake system too. Please give it a try.
The --disable-va fix is tested in 5.06 and working great so far.
The HEVC MAIN10 HW decoding still throws: Failed to initialize decoder. : undeveloped feature.
--> What is the best way to analyse that error in more detail? I don't have VTune Profiler or anything similar installed at the moment. Perhaps mikk9 is correct in saying "unsupported"; the information from Intel is inconsistent at best.
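One low-effort way to analyse the error is to map the printed message back to the underlying Media SDK status code; as far as I can tell, QSVEnc's "undeveloped feature" message corresponds to MFX_ERR_UNSUPPORTED (-3) from the Media SDK headers. A small lookup sketch (the numeric values are taken from mfxdefs.h; the message strings are my assumption based on QSVEnc's log output):

```python
# Sketch: map Media SDK (mfxStatus) codes to the messages QSVEnc prints.
# Numeric values are from the Media SDK headers; the strings are assumed
# from QSVEnc's log output and may not match the source exactly.
MFX_STATUS_MESSAGES = {
    0:   "none (success)",                # MFX_ERR_NONE
    -3:  "undeveloped feature",           # MFX_ERR_UNSUPPORTED
    -15: "invalid video parameters",      # MFX_ERR_INVALID_VIDEO_PARAM
    4:   "partial acceleration",          # MFX_WRN_PARTIAL_ACCELERATION
}

def explain(status: int) -> str:
    return MFX_STATUS_MESSAGES.get(status, f"unknown status {status}")

# "Failed to initialize decoder. : undeveloped feature." would then mean
# the decoder Query/Init call returned MFX_ERR_UNSUPPORTED (-3):
print(explain(-3))
```

If that reading is right, no profiler is needed: the driver is simply answering "unsupported" to the decoder initialization, which matches the "hardware doesn't support it" conclusion below.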
> Please note that OpenCL based filters are not supported when using --disable-va.
True, and that makes the missing HEVC 10bit HW decoding functionality on Skylake a minor issue. Does the limitation with 10bit decoding and OpenCL filters also apply to the more recent Intel generations?
There is no 10 bit HEVC hardware support in Skylake; Kabylake was Intel's first. The only solution could be a shader-based GPU acceleration solution (hybrid decoder), which could offload some work from the CPU to the GPU for video playback. But even then, it's another question whether such a GPU shader based solution is supported for encoding and decoding tasks.
https://github.com/intel/media-driver#decodingencoding-features
SKL HEVC 10bit... nothing there.
> There is no 10 bit HEVC hardware support in Skylake; Kabylake was Intel's first. The only solution could be a shader-based GPU acceleration solution (hybrid decoder), which could offload some work from the CPU to the GPU for video playback.
I think you are right - not every gen9 supports HEVC Main10 decoding:
| Platform | Generation | Added features |
| --- | --- | --- |
| Braswell | gen8 | H.265 decode; JPEG, VP8 encode |
| Skylake | gen9 | H.265 encode |
| Apollo Lake | gen9 | VP9, H.265 Main10 decode |
| Kaby Lake | gen9.5 | VP9 profile 2 decode; VP9, H.265 Main10 encode |
I was misled by the Skylake GPU presentation: https://www.anandtech.com/show/9562/intels-skylake-gpu-analyzing-the-media-capabilities
> Skylake's MFX engine adds HEVC Main profile decode support (4Kp60 at up to 240 Mbps). Main10 decoding can be done with GPU acceleration. The Quick Sync PG Mode supports HEVC encoding (again, Main profile only, with support for up to 4Kp60 streams). The DXVA Checker screenshot (taken on an i7-6700K, a part with Intel HD Graphics 530 / GT2) for Skylake with driver version 10.18.15.4248 is reproduced below. HEVC_VLD_Main10 has a DXVA profile, but it is done partially in the GPU (as specified in the slide above).
and the feature table at wikichip.org.
Thank you all for your help!
kind regards
@rigaya the program flow with RGY_WRN_PARTIAL_ACCELERATION is not used in that case - I couldn't find any log/debug messages containing "...partial...".
the output of --check-features still says: qsv_features_i7-6700k.txt
Decoding HEVC yuv420 10bit should be possible - perhaps the feature detection is a bit off for HEVC, like it is for AV1 (driver issue).
The result of --check-features (which is the result of the Query API of Media SDK) seems to be wrong.
I've found that Intel vaapi driver supports HEVC 10bit hw decode from Broxton, not from Skylake. https://github.com/intel/intel-vaapi-driver/blob/master/README#L50
Therefore, I think HEVC 10bit hw decode on Skylake under Linux is not possible.
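On Linux this can be cross-checked without QSVEnc by looking for the VAProfileHEVCMain10 / VAEntrypointVLD pair in `vainfo` output. A small parser sketch (the sample output below is abridged and hypothetical for a Skylake box; on a real system you would feed in the actual `vainfo` stdout):

```python
# Sketch: check vainfo-style output for a decode (VLD) entrypoint.
# SAMPLE_VAINFO is abridged and hypothetical for a Skylake system;
# a real check would capture: subprocess.run(["vainfo"], ...).stdout
SAMPLE_VAINFO = """\
      VAProfileH264Main               : VAEntrypointVLD
      VAProfileH264High               : VAEntrypointVLD
      VAProfileHEVCMain               : VAEntrypointVLD
      VAProfileVP8Version0_3          : VAEntrypointVLD
"""

def has_decode(vainfo_output: str, profile: str) -> bool:
    """True if the given VA profile lists the VLD (decode) entrypoint."""
    for line in vainfo_output.splitlines():
        if ":" not in line:
            continue
        prof, entrypoint = (s.strip() for s in line.split(":", 1))
        if prof == profile and entrypoint == "VAEntrypointVLD":
            return True
    return False

print(has_decode(SAMPLE_VAINFO, "VAProfileHEVCMain"))    # True
print(has_decode(SAMPLE_VAINFO, "VAProfileHEVCMain10"))  # False
```

If VAProfileHEVCMain10 never appears in `vainfo` output on Skylake, the intel-vaapi-driver simply doesn't expose it there, which is consistent with the README linked above.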
Question answered - the hardware doesn't support it. Thank you!
Hello rigaya,
I'm using the Skylake feature set, so 10bit HEVC HW decoding should be possible:
and https://en.wikichip.org/wiki/intel/hd_graphics/530#Hardware_Accelerated_Video
but the current qsvenc build exits with an error: Failed to initialize decoder. : undeveloped feature.
@lizhong1008 (from Intel) wrote 2 years ago:
but when I try to use system memory with the option --disable-va (Linux) I get:
MFXVPP: Failed to get required buffer size for MFXVPP: undeveloped feature.
MFXVPP: Failed to get required buffer size for MFXVPP: undeveloped feature.
MFXVPP: Failed to initialize vpp: invalid video parameters.
--> Please help me if this should be available/possible; I was not able to fix it on my own.
kind regards