moonlight-stream / moonlight-qt

GameStream client for PCs (Windows, Mac, Linux, and Steam Link)
GNU General Public License v3.0

[request] switch from DXVA to D3D11 decoding for Windows #317

Closed: makedir closed this issue 2 years ago

makedir commented 4 years ago

Would it be possible (not sure how much work this is) to implement D3D11 decoding support instead of DXVA for Windows 10?

auto-comment[bot] commented 4 years ago

If this is a question about Moonlight or you need help troubleshooting a streaming problem, please use the help channels on our Discord server instead of GitHub issues. There are many more people available on Discord to help you and answer your questions.

This issue tracker should only be used for specific bugs or feature requests.

Thank you, and happy streaming!

cgutman commented 4 years ago

Yes, it's possible, and I will get to it eventually; however, it isn't compatible with Windows 7.

makedir commented 4 years ago

Nice. Windows 7 is EOL; hardly anyone is using it, and if they are, I wouldn't support it.

cgutman commented 4 years ago

I don't see a compelling reason to drop Win7 support in January even though it's EOL. It doesn't cost us much to support, and there are platforms (older AMD drivers) where D3D11VA is significantly worse than DXVA2. We will likely have to support both DXVA2 and D3D11VA for the near future, even on Windows 10.

The most interesting aspects of D3D11VA are the ability to support HDR and Rec 2020 color, UWP support, and waitable swapchains for lower latency when not in full-screen exclusive mode.

makedir commented 4 years ago

No, the most interesting aspect of D3D11 is that it uses less CPU and less GPU too. Windows actually shows me that Moonlight is using the 3D part of the GPU, not the decoder part.

So does Moonlight stream at a flawed (low) color depth? I asked about this before: I notice gray color banding in the stream: https://i.imgur.com/5SVq0R3.png

Can this be fixed, or is it an Nvidia issue?

I also read that Windows 10 will drop DXVA support, but I'm not sure if that's true.

cgutman commented 4 years ago

What evidence do you have that D3D11VA is more efficient than DXVA2? I have not seen indications of that anywhere.

Moonlight uses the DXVA2 video processing API (IDirectXVideoProcessor) to do correct colorspace conversion and scaling (D3D11VA has a ID3D11VideoProcessor for the same purpose) - that's what you see showing up on the 3D usage. D3D11VA will very likely have the same behavior.

I'm not sure about the banding. I think it may be on the capture side but I'm not 100% sure.

makedir commented 4 years ago

I noticed lower CPU and GPU usage between DXVA and D3D11 in VLC in the past, and also in Chrome, which has supported D3D11 for some versions now (you can force it on).

But you see the color banding in that picture, right? Can you somehow test whether it happens on other clients? I don't actually have an Nvidia Shield console. I notice it a lot in some games, for example Diablo 3.

cgutman commented 4 years ago

You can't necessarily reason about Moonlight's CPU and GPU usage from other applications. Chrome, for example, composites the decoded output into a window alongside other HTML elements that are also rendered using D3D11, so it makes sense that using D3D11 is more efficient there.

Moonlight's CPU usage is so low that it's mostly noise anyway. I'm under 2% CPU usage on a Ryzen 1700X when decoding 1440p 120 Hz. Even a crazy CPU reduction of 50% (which I think is highly unlikely) would not make any real difference to performance or power consumption. We're bottlenecked on the actual GPU decoding capabilities long before we hit API or CPU limitations.

Let's keep the discussion to a single issue/request here. I'm aware of the banding.

sneakpeekdev commented 3 years ago

Hi there, is there any news about the color banding? I'm having the same issue here. Moonlight really works perfectly, except for the color banding. Using a GTX 1080 with driver 456.71 on the host side.

[screenshot showing gray color banding]

sbernier1 commented 3 years ago

Any update on this? It would be nice to be able to use HDR.

cgutman commented 2 years ago

D3D11VA is now implemented, but DXVA2 is still the default for SDR streaming until the D3D11 backend has gotten more testing and tuning.

crzsotona commented 7 months ago

> D3D11VA is now implemented, but DXVA2 is still the default for SDR streaming until the D3D11 backend has gotten more testing and tuning.

@cgutman, is there any way to enable it? I have a PC with two GPUs: an ancient Radeon 5770 and an Intel HD4000. The Radeon cannot decode 1080p at 60 fps (decoding constantly lags by up to 200 ms), while the HD4000 can, but it's only available as a D3D11 device (it has no monitor connected).

Parsec, for example, allows selecting the HD4000 as the decoder device, and with it I can stream H.264 at up to 4K pretty smoothly.

crzsotona commented 6 months ago

Well, it seems not: no setting is exposed to the user, sadly. Could anyone point me to where this DXVA/D3D11 detection takes place?

cgutman commented 6 months ago

The nightly builds already have switched to D3D11VA. You can download the Windows nightly here: https://ci.appveyor.com/project/cgutman/moonlight-qt/branch/master/job/nylnpsdetoh7b5fr/artifacts