sakirsahin opened this issue 3 years ago
Hey, what distribution are you using? Did you build from source, or did you install the release version?
Sorry for the late reply. I've done some other tests too; maybe they can help. When I set the FPS to 60 in Moonlight, both Xorg usage and Sunshine usage are high. When I set it to 30 FPS, Xorg usage drops to 3%, but Sunshine remains at 15-20%. When I close the game, Sunshine usage drops to 9-10% and I don't see any Xorg usage (while doing normal things on the desktop like surfing, etc.). Setting the bitrate to 20 Mbps or 60 Mbps makes no difference.
OS: Ubuntu 21.04, converted to Kubuntu without a reinstall
Install: deb release 0.10.1
Kernel: liquorix-amd64-5.13.0-16.1
If this CPU usage is not normal with NVENC, I can try reinstalling the system or building from source. (I could not find any reference numbers; I can only compare it with GeForce Now on Windows.)
Best regards. (Edit: added kernel info)
The current release version doesn't offload everything to the GPU yet for NVENC. The next release will be much better latency-wise. :wink:
Thank you so much for your hard work and for bringing such a good tool to Linux. I'm very eager for the next release then =)
Hey! I'm using Sunshine on a VM with a passed-through 1060, 6 cores, and 12 GB RAM. Sunshine seems to be using the CPU to encode the stream (75% usage), and the GPU seems like it's not being used at all. I'm on the latest version (Git AUR). Is this a known behavior? What can I do about it?
Thanks for this awesome project and have a nice day :)
You should update to the latest version from the GitHub releases page; it should be 0.11.1. After that, enable NvFBC and NVENC with nvlax or nvidia-patch (https://github.com/illnyang/nvlax / https://github.com/keylase/nvidia-patch). Either of them should work, but nvidia-patch has driver version limitations.
Those should drop CPU usage to 5-6%. But now I have some video glitches and wrong colors. Maybe that's only on my end.
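For anyone unsure what applying nvidia-patch involves, it is roughly this (a sketch based on the keylase/nvidia-patch README; only run it on a driver version the repo lists as supported):

```sh
# Sketch: applying keylase/nvidia-patch on a supported driver version
git clone https://github.com/keylase/nvidia-patch.git
cd nvidia-patch
sudo bash ./patch.sh       # removes the NVENC concurrent-session limit
sudo bash ./patch-fbc.sh   # unlocks NvFBC capture on consumer GPUs
```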
I have the latest version and applied nvidia-patch, but I can't use NVENC:
[2021:11:26:17:36:24]: Info: Trying encoder [nvenc]
[2021:11:26:17:36:24]: Info: Screencasting with X11
[2021:11:26:17:36:24]: Info: Color coding [Rec. 601]
[2021:11:26:17:36:24]: Info: Color range: [JPEG]
[2021:11:26:17:36:24]: Error: Could not open codec [h264_nvenc]: Function not implemented
NVENC works for me on Ubuntu 20.04 after patching with nvidia-patch (using both patch.sh and patch-fbc.sh) on NVIDIA driver 495.44. My unoverclocked Raspberry Pi 2 can't run moonlight-embedded at 1080p/60fps without lag, but 1080p/30fps works great. Interestingly, with the software encoder, it can run 1080p/60fps without issue. The lag is even worse on the PS Vita Moonlight client: the video freezes when using NVENC, but works flawlessly with the software encoder.
[2021:11:26:17:36:24]: Error: Could not open codec [h264_nvenc]: Function not implemented
This might not actually be a problem at all. The message is misleading: if you switch on full debug, you will probably see that it is just failing to use a specific feature of NVENC, not the whole thing. For me, it just can't use HDR10, which I suppose is expected on Linux.
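If it helps, full debug logging can be switched on in sunshine.conf (a sketch; min_log_level is the key documented for the 0.11-era builds, where lower means more verbose):

```
# sunshine.conf
min_log_level = 0   # 0 = verbose; the default is 2 (info)
```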
Anyway, I see this within the "Testing for available encoders" section:
Error: Could not open codec [h264_nvenc]: Function not implemented
because of this:
[h264_nvenc @ 0x55b4dc6eed40] 10 bit encode not supported
[h264_nvenc @ 0x55b4dc6eed40] Provided device doesn't support required NVENC features
[h264_nvenc @ 0x55b4dc6eed40] Nvenc unloaded
[2021:12:22:10:11:41]: Error: Could not open codec [h264_nvenc]: Function not implemented
But then later, it still uses NVENC:
[2021:12:22:22:55:30]: Info: Found encoder nvenc: [h264_nvenc, hevc_nvenc]
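The same behavior can be reproduced outside Sunshine with a plain ffmpeg test (a sketch; NVENC hardware has no 10-bit H.264 mode at all, so the second command is expected to fail with exactly that "10 bit encode not supported" message):

```sh
# 8-bit H.264 via NVENC: should succeed
ffmpeg -f lavfi -i testsrc2=size=1920x1080:rate=60 -frames:v 120 \
       -c:v h264_nvenc -f null -

# Forcing a 10-bit input format reproduces the failure from the log above
ffmpeg -f lavfi -i testsrc2=size=1920x1080:rate=60 -frames:v 120 \
       -pix_fmt p010le -c:v h264_nvenc -f null -
```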
I still get a fairly high CPU load with 0.11.1 and NVENC compared to GameStream. Comparing streaming at 1440p/120Hz on Windows/GameStream versus Linux/Sunshine, just streaming the desktop with the UFO motion test to generate display changes:
nvstream.exe = 1% of CPU: https://i.ibb.co/XVBMfqb/gamestream.png
sunshine = maxes out 1 CPU thread and doesn't reach the 120fps https://i.ibb.co/440cfSd/sunshine.png
Neither option stresses the hardware encoder on the host, but both are using it.
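For anyone who wants to reproduce the comparison, this is roughly how I measured it (a sketch; pidstat comes from the sysstat package, and the process names are what they happen to be on my system):

```sh
# Per-process CPU usage, sampled once per second
pidstat -u -p "$(pgrep -x sunshine)" 1
pidstat -u -p "$(pgrep -x Xorg)" 1

# GPU utilization; the "enc" column shows NVENC load
nvidia-smi dmon -s u
```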
I feel like I should provide an update to my previous message after more tinkering. Previously, I was not using screencasting with NvFBC, as I did not know how to do this on Arch. I have now installed nvidia-utils-nvlax to unlock NvFBC and added 'cuda', 'libdrm', and 'libcap' as build dependencies.
New result: sunshine = 13% CPU - rock solid 1440p/120Hz https://i.ibb.co/nzsFsb9/Moonlight-25-01-2022-19-26-43.png
Bye bye windows.
Two things that helped performance for me:
1) NvFBC: https://github.com/keylase/nvidia-patch (mentioned above)
2) Composition, needed for some games: https://unix.stackexchange.com/questions/510757/how-to-automatically-force-full-composition-pipeline-for-nvidia-gpu-driver
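For reference, the one-off command from that Stack Exchange thread looks like this (adjust the MetaMode to match your own display layout):

```sh
nvidia-settings --assign CurrentMetaMode="nvidia-auto-select +0+0 { ForceFullCompositionPipeline = On }"
```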
Just going to add my experience here. I'm using Sunshine/Moonlight purely as a VDI solution (no gaming) for a Linux VM. There is no GPU in this VM, so it's all software encoding. In my case, I don't see any difference in CPU usage. But after a few hours, everything still appears normal except that there is now a significant latency lag (around 2 seconds), and that amount of delay makes it unusable. Restarting the Sunshine service doesn't appear to help. The only way I've found to fix it is by completely shutting Sunshine down and then starting it again manually, instead of restarting. Once I do this, the latency lag disappears completely until another few hours pass. I don't know exactly how long it takes to manifest, but the next time I log in to the VM after a few hours, it has this lag, and I then have to stop and start the service again to fix it.
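Concretely, the difference is this (a sketch; I'm assuming a systemd user unit named sunshine, which may instead be a system unit depending on how it was installed):

```sh
# This does NOT clear the latency lag for me:
systemctl --user restart sunshine

# This does: a full stop followed by a fresh start
systemctl --user stop sunshine
systemctl --user start sunshine
```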
Oh I didn't realize this wasn't the official repo. Thanks for the heads-up.
While gaming, I observe high CPU usage, which causes significant performance issues. According to KSysGuard, Sunshine plus Xorg consumes about 20-21% of my CPU. But when the game is closed, Xorg usage drops to 3-4% and Sunshine stays around 10%. When I use GeForce Now on Windows, I don't observe this; GeForce Now only consumes about 6% of the CPU.
I use the following settings:
Encoder: nvenc
Preset: llhp
Bitrate control: cbr
NVENC coder: auto
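In sunshine.conf those settings correspond to roughly this (a sketch; the key names are taken from the 0.11-era configuration docs and may differ across versions):

```
encoder = nvenc
nv_preset = llhp
nv_rc = cbr
nv_coder = auto
```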
Is there anything I can do about this situation by changing settings, or will it need an update to the code?
Best Regards.