IGCIT / Intel-GPU-Community-Issue-Tracker-IGCIT

IGCIT is a Community-driven issue tracker for Intel GPUs.

10 bit SDR video output #658

Open BuyMyMojo opened 9 months ago

BuyMyMojo commented 9 months ago

Application [Required]

N/A

Processor / Processor Number [Required]

5900x

Graphic Card [Required]

A770

Rendering API [Required]

Windows Build Number [Required]

Other Windows build number

No response

Describe the feature [Required]

Displays can accept a full 10-bit signal over HDMI and DisplayPort (RTINGS maintains a good list of such monitors). This is even available on modern consoles: on the PS5 it is called Deep Colour Output, which allows bit depths of 10-12 bit and reportedly 16 bit too.

Having the option to output 10-bit to my monitors would be much appreciated for colour work.

Additional notes

No response

freak2fast4u commented 9 months ago

Agreed, IMHO this is more useful by itself than HDR overall.

pcslide commented 9 months ago

I don't understand what exactly you are asking for. Did you mean this option? (screenshot)

BuyMyMojo commented 9 months ago

> I don't understand what exactly you are asking for. Did you mean this option?

(screenshot) Not there for me with my A770.

BuyMyMojo commented 9 months ago

(screenshot) lol

BuyMyMojo commented 9 months ago

(screenshot) Even the beta version is missing it.

kpsam2000Intel commented 9 months ago

In general, exposing the final display bits per color (BPC) is highly dependent on what the panel exposes as its capability to the source device, i.e., it depends on the panel's EDID. If it's an HDR panel, it will usually support 10bpc. Also, please note that the advantage of outputting more than 8bpc is limited while the desktop is in 8bpc non-HDR mode. When the desktop enters HDR/ACM mode and the panel supports more than 8bpc, the OS/driver will automatically enable more than 8bpc.
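For HDMI sinks, one place this capability is advertised is the HDMI Vendor-Specific Data Block in the EDID's CTA-861 extension. As a rough illustration (not code from the Intel driver; the function name and the source of the EDID dump are assumptions), a C++ sketch that scans a raw EDID dump for that block and reports its Deep Color flags:

```cpp
// Hedged sketch: report the HDMI Deep Color flags from a raw EDID dump.
#include <cstdint>
#include <cstdio>
#include <vector>

void ReportDeepColorFlags(const std::vector<uint8_t>& edid)
{
    // CTA-861 extension blocks follow the 128-byte base block; tag 0x02.
    for (size_t ext = 128; ext + 128 <= edid.size(); ext += 128) {
        if (edid[ext] != 0x02) continue;
        size_t dtdOffset = edid[ext + 2];  // start of detailed timings
        size_t i = ext + 4;                // first data block
        while (i < ext + dtdOffset) {
            uint8_t tag = edid[i] >> 5;    // bits 7..5: block type
            uint8_t len = edid[i] & 0x1F;  // bits 4..0: payload length
            // Vendor-specific block (tag 3) with the HDMI OUI 00-0C-03,
            // stored little-endian.
            if (tag == 3 && len >= 6 &&
                edid[i + 1] == 0x03 && edid[i + 2] == 0x0C &&
                edid[i + 3] == 0x00) {
                uint8_t flags = edid[i + 6];  // byte 6 of the HDMI VSDB
                std::printf("DC_30bit (10bpc): %s\n", (flags & 0x10) ? "yes" : "no");
                std::printf("DC_36bit (12bpc): %s\n", (flags & 0x20) ? "yes" : "no");
                std::printf("DC_48bit (16bpc): %s\n", (flags & 0x40) ? "yes" : "no");
            }
            i += 1 + len;  // advance past header byte plus payload
        }
    }
}
```

Note this only covers the HDMI case; DisplayPort sinks signal supported bit depths differently (EDID base-block video input fields and DPCD).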

BuyMyMojo commented 9 months ago

> In general, exposing the final display bits per color (BPC) is highly dependent on what the panel exposes as its capability to the source device, i.e., it depends on the panel's EDID. If it's an HDR panel, it will usually support 10bpc.

I have three displays attached that all get 10-bit output on my old 2070 Super by default, but all three report receiving only an 8-bit signal. The issue is that the setting is missing completely from both the Arc software and the Intel control software on my system, with an A770 16GB reference card and a 5900x.

From #586 it seems some settings are restricted depending on whether the CPU is Intel; I wonder if this is one of those settings?

BuyMyMojo commented 8 months ago

Still not in driver 31.0.101.5085

jpovixwm commented 8 months ago

The "Color Depth" option in IGCC is only shown for displays connected to a native HDMI port on the GPU, of which A770 and A750 have zero. There exists a workaround for Intel platforms, where you can passthrough the video output via your iGPU: https://www.intel.com/content/www/us/en/support/articles/000092899/graphics.html But you're on AMD (me too!) so the workaround doesn't apply. Anyway, don't get your hopes up about this issue getting fixed any time soon.

This problem was previously discussed in #179 and #304, both of which are now closed without a proper solution.

PS. @IntelSupport-Arun said the following on Nov 21, 2022: (emphasis mine)

> Due to current workloads, I would like to let you know it might take up to 12 months to complete.

BuyMyMojo commented 8 months ago

The "Color Depth" option in IGCC is only shown for displays connected to a native HDMI port on the GPU, of which A770 and A750 have zero.

does it not work on the HDMI port on my A770? is there something special about a native HDMI port?

(photo of the card's ports)

jpovixwm commented 8 months ago

This port uses a DP->HDMI PCON (protocol converter) chip. Effectively, it's the same as if you used an external DP->HDMI converter on any of the DP ports.

BuyMyMojo commented 8 months ago

Oh, that... is certainly a design choice.

bandit8623 commented 7 months ago

A 10-bit option would be great to add; gradients look so much better with 10-bit, even on the desktop. Using HDR to get 10-bit is not optimal, since on most monitors you lose any ability to control brightness and gamma. Try the gradient test yourself: https://www.eizo.be/monitor-test/

lc200 commented 7 months ago

Found this thread after having the same issue with the integrated graphics on an Intel i7 12700 driving two Dell monitors, one on HDMI and the other on DisplayPort. For the monitor on the HDMI connection, the Intel Graphics Command Center lets me select 8, 10, or 12 bit for SDR. The monitor connected to the DisplayPort has no option to select bit depth; the only way to get 10-bit in Windows is to set the monitor to use HDR, but I don't want that.

There is no problem with the monitor on the DisplayPort connection running 10-bit SDR: at the BIOS splash screen, or even in the BIOS setup, the monitor is detected and syncs at 4K/60Hz @ 30-bit automatically. Only when it gets to Windows 11 does it switch to 4K/60Hz @ 24-bit, with no option in the Intel graphics driver settings to change it.

We need a fix, please, Intel.

theofficialgman commented 3 months ago

Cross posting for visibility since https://github.com/IGCIT/Intel-GPU-Community-Issue-Tracker-IGCIT/issues/179#issuecomment-2169147302 is closed

srgzx commented 3 months ago

"If it's a HDR panel usually it will support 10bpc"

HDR requires 10-bit color, so if the display supports HDR, it must also support a 10-bit color signal, and this is independent of HDR, so it should work in SDR mode

This doesn't mean the panel is a true 10-bit panel; it could be 8-bit with FRC (temporal dithering).

But in either case, it makes no sense to send an 8-bit color signal to a panel that is capable of receiving a 10-bit one.

But here's where things get messy... unfortunately, most applications on Windows don't support 10-bit color.

And I don't think Linux (in general) has support for this either.

Photoshop does, though, as do various other creative and modeling applications. And with every other GPU, you don't have to enable HDR to get 10-bit color... it works in SDR mode.
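To check what Windows itself currently reports for each active display, a minimal sketch (not from this thread; assumes Windows 10 1709+ and linking against user32) using the DisplayConfig API:

```cpp
// Hedged sketch: print the bit depth and HDR state Windows reports for
// each active display path. This only reads the OS view; it changes nothing.
#include <windows.h>
#include <cstdio>
#include <vector>

int main()
{
    UINT32 numPaths = 0, numModes = 0;
    if (GetDisplayConfigBufferSizes(QDC_ONLY_ACTIVE_PATHS,
                                    &numPaths, &numModes) != ERROR_SUCCESS)
        return 1;

    std::vector<DISPLAYCONFIG_PATH_INFO> paths(numPaths);
    std::vector<DISPLAYCONFIG_MODE_INFO> modes(numModes);
    if (QueryDisplayConfig(QDC_ONLY_ACTIVE_PATHS, &numPaths, paths.data(),
                           &numModes, modes.data(), nullptr) != ERROR_SUCCESS)
        return 1;

    for (UINT32 i = 0; i < numPaths; ++i) {
        // Query the advanced color info for this path's target (monitor).
        DISPLAYCONFIG_GET_ADVANCED_COLOR_INFO info = {};
        info.header.type = DISPLAYCONFIG_DEVICE_INFO_GET_ADVANCED_COLOR_INFO;
        info.header.size = sizeof(info);
        info.header.adapterId = paths[i].targetInfo.adapterId;
        info.header.id = paths[i].targetInfo.id;
        if (DisplayConfigGetDeviceInfo(&info.header) == ERROR_SUCCESS) {
            std::printf("Display %u: %u bits per color channel, advanced color %s\n",
                        i, info.bitsPerColorChannel,
                        info.advancedColorEnabled ? "on" : "off");
        }
    }
    return 0;
}
```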

rickbrew commented 3 months ago

> unfortunately, most applications on Windows don't support 10-bit color.

Any app or game that uses DirectX flip-model presentation and a high bit-depth swapchain format (like R10G10B10A2_UNORM or R16G16B16A16_FLOAT) supports 10-bit color output (source: https://learn.microsoft.com/en-us/windows/win32/direct3darticles/high-dynamic-range ). This includes some games (maybe a lot of them, I don't know), and can also include older games because (IIRC) flip model is automatically forced by the "enable optimizations for windowed games" setting.
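As a hedged sketch of what such an app sets up (assuming an existing ID3D11Device and target HWND; the helper name is illustrative and error checking is omitted for brevity):

```cpp
// Minimal sketch: create a flip-model swapchain with a 10-bit back buffer.
#include <windows.h>
#include <d3d11.h>
#include <dxgi1_2.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

ComPtr<IDXGISwapChain1> CreateTenBitSwapChain(ID3D11Device* device, HWND hwnd)
{
    // Walk up from the device to the DXGI factory behind its adapter.
    ComPtr<IDXGIDevice> dxgiDevice;
    device->QueryInterface(IID_PPV_ARGS(&dxgiDevice));
    ComPtr<IDXGIAdapter> adapter;
    dxgiDevice->GetAdapter(&adapter);
    ComPtr<IDXGIFactory2> factory;
    adapter->GetParent(IID_PPV_ARGS(&factory));

    DXGI_SWAP_CHAIN_DESC1 desc = {};
    desc.Format = DXGI_FORMAT_R10G10B10A2_UNORM;     // 10 bits per channel
    desc.SampleDesc.Count = 1;
    desc.BufferUsage = DXGI_USAGE_RENDER_TARGET_OUTPUT;
    desc.BufferCount = 2;
    desc.SwapEffect = DXGI_SWAP_EFFECT_FLIP_DISCARD; // flip model (Win10+)
    desc.Scaling = DXGI_SCALING_NONE;

    ComPtr<IDXGISwapChain1> swapChain;
    factory->CreateSwapChainForHwnd(device, hwnd, &desc,
                                    nullptr, nullptr, &swapChain);
    return swapChain;
}
```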

Windows 11 v24H2 is shipping with better Wide Color Gamut support that users can toggle on themselves, which works best with >8-bit output (otherwise sRGB apps, which is most of them, will lose precision and thus show banding). They're finally delivering on what they promised two years ago: https://devblogs.microsoft.com/directx/auto-color-management/

Point being, 10-bit is definitely useful outside of HDR, and this could become very important soon if Microsoft decides to lean on this and even make it default to the "on" state for new systems.


jpovixwm commented 2 months ago

FYI: It seems to be working now with driver 32.0.101.5768 and IGCC v1.100.5536.0 (screenshots; credit to sgredsch, who posted a comment on videocardz.com).

brooksik commented 1 month ago

My graphics card is an Intel Arc 750 LE, connected via DP to a 25" Samsung Odyssey G4 monitor. I can confirm that Intel has finally added 10-bit SDR support through IGCC (Intel Graphics Command Center) 🤟; there is no such option in Intel Arc Control. (screenshots)