Closed: lavadrop closed this issue 2 years ago.
FFXIV reports 59 fps, but Remote Play drops to 30 fps.
No Man's Sky drops to 30 fps in borderless window when the program isn't focused, like when Steam Remote Play is capturing the background window.
No Man's Sky returns to 60+ fps when it gains focus again.
Super Indie Karts has both a native Linux version and a Windows version to be used through Proton. The native version makes Steam Remote Play encode video at 60 fps, but Proton drops it to 30, whether the game is focused or not.
Native
Proton
I have also noticed Remote Play capturing at a very low framerate recently; I remember it being a lot better. Disabling hardware acceleration gives me a couple more fps, but most games are unplayable like this at around 24 fps. Strangely, Halo MCC has great capture performance even though it's running through Proton and DXVK, so I'm not sure that's the sole reason...
I am on Arch Linux and have had a similar problem, both at 1080p and 1440p. It worked fine for me in December. I will provide further details once I've had a chance to test a variety of clients, and I'll post some logs too.
Hi, I'm having the exact same issue. For me it only happens with games that run on Proton; native games can be streamed just fine at 60 fps.
This issue started happening to me since the last stable release of the Steam client.
Things that I've noticed:
No matter what I try, I cannot get back to the normal 60 FPS.
My PC specs:
OS: Manjaro Linux
Kernel: 5.16.11-2-MANJARO
CPU: AMD Ryzen 7 5800X 8-Core
GPU: AMD Radeon RX 5700 XT (NAVI10, DRM 3.44.0, 5.16.11-2-MANJARO, LLVM 13.0.1)
GPU driver: 4.6 Mesa 21.3.7
RAM: 16 GB
Here are a couple of screenshots I took, one from Dragon's Dogma (Proton) locked at 30 FPS and the other from Don't Starve Together (native) running just fine.
The same is happening here.
As long as the game is focused, encoding switches from "OpenGL NV12 + libx264" to "Game Vulkan RGB + libyuv + libx264", exactly like back in https://github.com/ValveSoftware/steam-for-linux/issues/7185
If I alt-tab out to the desktop, it returns to smooth, locked 60 fps software encoding.
So clearly libyuv can't handle 1920x1200@60fps, not even with 8 encoding threads on a 2700X (or the Vulkan Proton game capture is broken).
Steam devs, please allow me to untick this; I'd much rather have an additional frame of lag but buttery-smooth 60 fps over this thing. Or fix libyuv to be able to handle 60 fps.
Warframe, focused:
Warframe, alt-tabbed to desktop (forcing OpenGL NV12 capture):
Also, as expected, disabling DXVK by forcing PROTON_USE_WINED3D=1 %command% works around the issue.
I can confirm the issue: "Game Vulkan RGB + libyuv + libx264" works for 1280x720@60fps, but is limited to 30 fps at 1920x1080, even with 8 encoding threads on a Ryzen 3600X (on Ubuntu 21.10). By the way, I have an AMD RX 5700 XT and I can't manage to activate hardware encoding.
Anyway, thanks @waffshappen for the workaround with PROTON_USE_WINED3D=1 %command%
By 1280x720, do you mean running the game you are streaming at that resolution, or setting the client to 720p?
I cannot get 60 fps at 720p either.
Edit: I managed to get 60 fps by running my games in windowed mode and setting the in-game resolution to 720p. It's not ideal, but at least it's not 28 fps. Hope we get an update soon.
@Siysrril Yes, that is exactly how I reached 60 fps with DXVK. But it's better to put PROTON_USE_WINED3D=1 %command% in the Launch Options of your game. If your game is not too demanding, you don't need DXVK: you can achieve better results (1920x1080@60fps) with OpenGL.
Suffering from the same issue. It has something to do with the fact that encoding is done through "Game Vulkan RGB + libyuv + [XXXXXX]". Display latency jumps to 100+ ms instead of the usual 20 ms.
OpenGL games work fine when streamed from my Linux host, and so does the Steam UI itself, because they use "Game Delayed OpenGL NV12 + [xxxx]".
It's a Vulkan/Proton issue in the way the frames are being captured/encoded.
I think this other issue refers to the same problem: https://github.com/ValveSoftware/steam-for-linux/issues/5591
Worth noting that in some games forcing OpenGL mode via the launch parameter above doesn't entirely fix the issue. Recently I had this problem with Cloudpunk and got slightly better results using the parameter, but still much worse latency/frame timing than on the desktop. I've been having similar issues with Elden Ring, but haven't tried the parameter there yet.
I too seem to remember this not being a problem last year, but the linked issue above has reports going back a long time.
Seems to also be the case for custom Proton builds such as TKG. I haven't yet tried a non-protonified Wine build to see the results there. But I've seen reports that Windows users have had similar issues, so it may be more to do with the Steam streaming tech in general than any specific component. It's difficult to pinpoint exactly at what layer this problem is happening.
Since we can start a bash script from a game's Launch Options, I would like to detect whether Remote Play is active so I can put PROTON_USE_WINED3D=1 behind a condition. (Let's be honest, I have some small graphical glitches with OpenGL that are not present with DXVK, so I prefer to use DXVK when I'm not using Remote Play.) Then I could use this bash script as a workaround for every game that has this issue.
Any idea how to do it? Is there a variable or a process name to check?
Off the top of my head, I don't know a process name, though there probably is one you could grep out of a process list if you tested it. But one thing that comes to mind, if you're using Nvidia hardware: maybe nvidia-smi can be queried to see if NVENC is active or not.
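Something along these lines might work as a check (a sketch; the encoder.stats fields are listed by nvidia-smi --help-query-gpu):
# A non-zero session count means NVENC is currently encoding something,
# which should be the case while Remote Play uses hardware encoding
$ nvidia-smi --query-gpu=encoder.stats.sessionCount --format=csv,noheader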
So here is my script to manage both local play and remote play (for DARK SOULS™ II: Scholar of the First Sin, but it can easily be modified for any game). My setup: the local PC is Ubuntu 21.10 + Xorg at 2160p; the remote is an Nvidia Shield with a 1080p display.
Step 1: configure your video settings in Dark Souls 2 for remote play, quit the game, and go to ~/.steam/debian-installation/steamapps/compatdata/335300/pfx/drive_c/users/steamuser/AppData/Roaming/DarkSoulsII
Then manually copy GraphicsConfig_SOFS.xml to GraphicsConfig_SOFS.xml.remoteplay
Step 2: configure your video settings in Dark Souls 2 for local play, quit the game, and go to ~/.steam/debian-installation/steamapps/compatdata/335300/pfx/drive_c/users/steamuser/AppData/Roaming/DarkSoulsII
Then manually copy GraphicsConfig_SOFS.xml to GraphicsConfig_SOFS.xml.localplay
Step 3: create the startup script. For example:
$ nano ~/scripts/DarkSouls2.sh
Inside, write:
#!/bin/bash
DS2FOLDER="$HOME/.steam/debian-installation/steamapps/compatdata/335300/pfx/drive_c/users/steamuser/AppData/Roaming/DarkSoulsII"
# Test if Remote Play is active by checking if the steam_monitor process is present
if pidof steam_monitor > /dev/null; then
    # Remote Play is active: use OpenGL and video settings optimized for remote play
    # (1080p. Windowed mode is not needed for OpenGL; it's only needed when remote playing with Vulkan)
    export PROTON_USE_WINED3D=1
    cp "$DS2FOLDER/GraphicsConfig_SOFS.xml.remoteplay" "$DS2FOLDER/GraphicsConfig_SOFS.xml"
else
    # Remote Play is not active: use DXVK and video settings optimized for local play (fullscreen + 2160p)
    export PROTON_USE_WINED3D=0
    cp "$DS2FOLDER/GraphicsConfig_SOFS.xml.localplay" "$DS2FOLDER/GraphicsConfig_SOFS.xml"
fi
# Start Dark Souls 2
"$@"
Make it executable :
$ chmod +x ~/scripts/DarkSouls2.sh
Step 4: fill the game's Launch Options (right-click the game in the Steam Library, then Properties) like this:
~/scripts/DarkSouls2.sh %command%
Enjoy local and remote play freely
I did some testing after noticing that some games installed through Lutris and added to Steam as "Non-Steam" games didn't have this issue. For those games, the Steam overlay did not load properly.
So I disabled the overlay for some Steam games by adding this to the launch options: LD_PRELOAD= %command%
With that argument the "slow convert" issue doesn't happen. Of course, there are some side effects, like the in-game controller configuration not being accessible.
To work around that I configure the controller for the game before adding the argument to the launch options.
Also, I think that, with that argument in the launch options, most of what makes the Steam Controller so good is also not accessible.
For me, this has been the best workaround I've found so far. I hope it's useful for somebody else.
Edit: Forgot to mention, disabling the overlay in Steam settings > In-game section doesn't have the same effect as adding LD_PRELOAD= %command% in the launch options, so maybe the issue is not the overlay either?
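For what it's worth, you can check whether the overlay actually got injected into a running game by looking at its memory map (a sketch; the process name is an example, and I'm assuming the overlay library is gameoverlayrenderer.so, as on typical Steam installs):
#!/bin/bash
# Replace DarkSoulsII.exe with your game's process name
pid=$(pgrep -f DarkSoulsII.exe | head -n 1)
if grep -q gameoverlayrenderer "/proc/$pid/maps"; then
    echo "overlay loaded"
else
    echo "overlay not loaded"
fi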
For me, regarding the encoder used, LD_PRELOAD= %command% does the same as PROTON_USE_WINED3D=1: it forces the OpenGL encoder instead of Vulkan.
Replying to https://github.com/ValveSoftware/steam-for-linux/issues/8423#issuecomment-1100921734
PROTON_USE_WINED3D=1 worked for me too; however, performance in a lot of my games was really bad, so I had to look for an alternative.
For me, disabling the Steam overlay via the Steam settings (without LD_PRELOAD= in the launch options) causes the OpenGL encoder to be used, which results in 60 fps capture (as long as my GPU isn't at 100%...). Thanks @Siysrril! I restarted Steam after disabling the overlay globally; maybe that makes a difference, since @Siysrril didn't notice an effect.
When playing TUNIC in Proton with Remote Play, if I start the game in Windowed mode, and then go into the settings and change it to Fullscreen, the resolution change somehow bumps the streaming to "Desktop OpenGL" and I get smooth 60fps. (Changing it back to Windowed doesn't hurt anything at that point, and makes it so I can use the trick again on the next launch).
@Siysrril @VoodaGod Instead of nuking LD_PRELOAD altogether, you could also use PRESSURE_VESSEL_REMOVE_GAME_OVERLAY=1 ... Seems to have the same effect, albeit with the already listed side effects.
Just wanting to shed some light on this, as it is still unsolved and still a pretty huge issue. Any capture of a game rendering with Vulkan is limited to ~30 fps. This includes both Proton+DXVK and native Vulkan. Switching to OpenGL rendering for these titles isn't really viable, nor is disabling the overlay, as that renders my controller useless.
I'm trying to investigate where the actual bottleneck is occurring [somewhere in the Vulkan capture, or perhaps in libyuv?]. Hopefully we can make some progress on this issue if any subscribers have further information they have since discovered.
No new info from me, other than it still happens, and to be honest it's put me off using Remote Play as much as I used to. I would say the "limit" isn't a cap or hard limit like vsync or frame limiting: on my system it can be more like 43 fps or something, but the frame timings are really inconsistent and the input latency very high.
If the bottleneck is libyuv and can be identified, I wonder if a patched libyuv could be force-loaded over the statically linked version to help remedy the issue? Depends on exactly what's going on with it. Maybe a debug build of libyuv could be loaded with an LD_ environment variable and profiled somehow? Not sure how feasible that is.
Well, I tried, and had precisely zero luck, and I've moved over to Moonlight. The input latency feels slightly worse, and the Steam Controller integration isn't great, but the video, sound and connection are all excellent.
On my Manjaro KDE X11 setup, I don't have to put WINED3D in the launch options. Instead I disable the Steam overlay and put at least gamescope -f -- %command% in the launch options (gamescope must be installed), and somehow it works at 1080p 60 fps. Gamescope seems to keep the Desktop OpenGL NV12 encoder, possibly leading to a smoother experience. Tested on Batman Arkham Knight, Megaman Legacy Collection, Cuphead and Spyro Reignited Trilogy, using an AMD RX 580.
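If gamescope works for you, it may also be worth pinning the output size and refresh rate explicitly instead of relying on the defaults (a sketch; check gamescope --help for the exact flags your version supports):
# Wrap the game in a 1920x1080@60 fullscreen gamescope session
gamescope -W 1920 -H 1080 -r 60 -f -- %command%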
I was able to force the OpenGL encoder just by disabling the Steam overlay, even without gamescope.
Unfortunately, disabling the Steam overlay breaks Steam Input completely.
I have a theory I'd like to confirm. How many people here are using a monitor that supports 10-bit color/HDR? I know Linux doesn't support it, but some of the backend code in RandR at the very least detects it, and I'm wondering if that isn't being accounted for in the encoder.
Hmm, my primary monitor supports it, but I have it disabled in the monitor's settings. I've no idea if disabling it there prevents it from detecting support or not, though. Is there a way to check?
Try: xwininfo -root | grep Depth. If you get 24, that's 8-bit; if you get 30, that's 10-bit. AFAIK xrandr sets it based on the EDID data it's presented.
Both of my monitors do indeed support HDR (however, I'm not sure which "standard" they support; I assume 10-bit).
xwininfo -root | grep Depth returns Depth: 24 for me. I believe I have HDR disabled on both my monitors as well.
I THINK I see where you're going with this and I like it.
Hmmm, well, I had this epiphany sitting at my desk at work. I'm getting off in 15 minutes, and I have one monitor that's a TN panel that barely even has DisplayPort. I'm going to try Remote Play with my HDR monitor completely unplugged and report back.
$ xwininfo -root | grep Depth
Depth: 24
I have a theory I'd like to confirm. How many people here are using a monitor that supports 10bit color/HDR? I know Linux doesn't support it but some of the backend code in randr at the very least detects it and I'm wondering if that isn't being accounted for in the encoder.
Both of my monitors do indeed support HDR (however I'm not sure which "standard" they support. I assume 10 bit)
xwininfo -root | grep depth
returnsDepth: 24
for me. I believe I have HDR disabled on both my monitors as well.I THINK I see where you're going with this and I like it.
I think I found what I was looking for. Normal monitor:
sudo cat /sys/kernel/debug/dri/0/DP-1/output_bpc
Current: 8
Maximum: 8
HDR monitor:
sudo cat /sys/kernel/debug/dri/0/DP-2/output_bpc
Current: 10
Maximum: 10
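In case it helps others check quickly, every connector can be dumped in one go (a sketch, assuming the same debugfs layout as above):
# Print current/maximum bpc for every connector the GPU exposes
$ sudo sh -c 'for f in /sys/kernel/debug/dri/0/*/output_bpc; do echo "== $f =="; cat "$f"; done'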
Here's my result on both monitors (I have HDR turned off on both):
Current: 8 Maximum: 10
Unfortunately, I'm still experiencing the same behavior.
Tried with only the regular monitor connected, but I did learn something: my second monitor is 1080p, and when I ran Remote Play on it I was getting ~44 fps as opposed to the consistent 22 I was getting on my 1440p monitor. So I tried setting my main monitor to 720p, and with that I could easily get a solid 60 fps. So I think the bottleneck is in the initial capture: for some reason it's not actually able to use a lower capture resolution, and that slows down the whole process. That is strange, though, given I can normally record 1440p 60 fps video with no trouble using OBS with VAAPI.
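For anyone wanting to script the resolution switch around a streaming session, something like this should do it (a sketch; the output name and modes are examples, check xrandr for yours):
# Drop the desktop to 720p for streaming...
$ xrandr --output DP-1 --mode 1280x720
# ...and restore it afterwards
$ xrandr --output DP-1 --mode 2560x1440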
I had to use
sudo cat /sys/kernel/debug/dri/0/HDMI-A-1/output_bpc
Current: 8
Maximum: 8
... so no HDR. However, I'm seeing the same behavior: reducing the resolution to 720p in-game gives a solid 60 fps.
Interestingly, I only get the performance boost if I set my display resolution; changing it in-game does nothing.
I suspect that's to do with Proton's "fake fullscreen" hack/patch, which stretches the output to the native resolution rather than triggering a resolution switch on the monitor itself. So the captured video in that case is still the full native resolution.
I noticed that setting a game to windowed mode often gives a performance boost for the same reason though. It just seems like something is CPU bound, basically, and whatever Steam is doing to capture in Vulkan has some massive inefficiency or mistake in the implementation that no one has looked into in years.
It seems to be libyuv. Which is absolutely bizarre, colorspace conversion shouldn't be a major factor on systems that can play the games in the first place, and libyuv is well-optimized, supports SIMD, etc. But maybe there's something in the way that all of the layers are glued together that incurs a really high memory access penalty or something like that.
I've tested on my Steam Deck and it's still bottlenecking, which totally tracks given it's essentially a Linux PC.
Yeah, my feeling is that it's something in the Steam middleware that's perhaps marshalling the data inefficiently, or doing some expensive array conversion or something. It can't be libyuv by itself; I have other software that uses it that doesn't have a problem.
It seems the conversion from RGB to YUV420 costs so much that it only keeps up at resolutions lower than 720p.
I'm 95% sure I've actually figured out the issue this time. I did some A/B testing using Valheim, because it has both Vulkan and OpenGL support. I measured the average bitrate and fps in both modes and used them to find the kbits per frame. In OpenGL on the title screen I get 60 fps using ~15.5 Mbit/s, meaning ~258.3 kbit per frame. In Vulkan on the title screen I get 20 fps using ~10.4 Mbit/s, meaning ~520 kbit per frame.
Now, we know that in OpenGL we are using the NV12 pixel format, which is 4:2:0, meaning 8 bits per pixel. So if we divide 258 by 8 we get ~32.29, and using this we can derive the bits per pixel used in Vulkan by dividing by this value, which gives 16 bits per pixel: basically double. So obviously the bottleneck is that the conversion isn't going to an 8bpp format for some reason. Now if only we could get someone from Valve to look at why. My guess is it's using a 4:2:2 format like NV16 instead of NV12 for some reason.
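Whatever the exact per-pixel figure turns out to be, the interesting part is the ratio (just redoing the arithmetic above; plug in your own overlay readings):
# Per-frame bit budget from the measured bitrate/fps averages
awk 'BEGIN {
    gl = 15500 / 60   # OpenGL: ~258.3 kbit per frame
    vk = 10400 / 20   # Vulkan: ~520.0 kbit per frame
    printf "Vulkan uses %.2fx the bits per frame of OpenGL\n", vk / gl
}'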
Hello,
I am attempting to replace my Windows box with Manjaro Linux; the primary use-case is gaming via Steam. Everything works locally, even via Proton for Windows games, but I am having difficulties with streaming games to my TV.
From my experience, the primary screen resolution needs to match the client in order for Remote Play to work without gamescope. This is related to another open issue: https://github.com/ValveSoftware/steam-for-linux/issues/7130#issuecomment-1217767064
I have tried the following:
PROTON_USE_WINED3D=1 causes the game to crash and not start.
LD_PRELOAD= %command% causes the controller not to work.
PRESSURE_VESSEL_REMOVE_GAME_OVERLAY=1 causes the controller not to work.
My system info can be found in the following comment (where you can see the current issues I am having with Sunshine): https://github.com/LizardByte/Sunshine/issues/176#issuecomment-1216369368
I also ran a few experiments. I connected my Steam Deck to my desktop PC (equipped with an RX 580) via LAN (no Wi-Fi). The Steam Deck was connected to an Asus 16:10 monitor (1680x1050) using a USB-C hub. I took Aragami as the test subject, because this game runs well via Proton and it has a native Linux port. I report the session stats logged by the Steam client in streaming_log.txt in Steam's logs folder.
Results (in ms, avg):
@ 1920x1080, Proton, ~25 fps: capture 10.45, convert 28.49, encode 10.85, network 1.43, decode 0.38, display 0.62
@ 1680x1050, Proton, ~30 fps: capture 7.37, convert 17.28, encode 12.76, network 1.32, decode 0.38, display 0.61
@ 1280x800, Proton, ~58 fps: capture 7.61, convert 9.82, encode 6.65, network 1.40, decode 0.31, display 0.59
@ 1280x720, Proton, ~60 fps: capture 5.48, convert 8.96, encode 6.87, network 1.91, decode 0.31, display 0.63
@ 1920x1080, native: capture 0.34, convert 0.011, encode 12.48, network 1.53, decode 0.36, display 0.81
As it seems, the stream switches between 60 fps, 30 fps, and 15 fps targets.
Full log is attached.
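If anyone wants to pull the same numbers from their own sessions, the per-stage stats can be grepped out of the log (a sketch; the logs folder path varies per install, this one matches the Ubuntu layout used earlier in the thread):
# Show recent timing-related lines from the Remote Play session log
$ grep -iE 'capture|convert|encode|decode' ~/.steam/debian-installation/logs/streaming_log.txt | tail -n 20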
How odd. You can see in there that it is indeed targeting 4:2:0 and not 4:2:2, so it should be using an 8bpp format, but obviously from the network traffic it's not.
Argh, Google somehow failed me. I could have sworn at first that NV12 was 12 bpp, and I was right. I googled the other day to double-check myself, and the top result said it was 8 bpp for some reason, so I did my calculations based on that. Well, I did some more reading just now and found multiple sources saying that no, it IS 12 bpp, which means the slow convert is happening at 24 bpp. That makes WAY more sense, because it means the chroma subsampling conversion just isn't happening at all.
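For scale, the raw (pre-encoder) buffer sizes that implies at 1080p (plain arithmetic, nothing measured):
# Raw frame sizes at 12 bpp (NV12) vs 24 bpp (unsubsampled RGB)
awk 'BEGIN {
    px = 1920 * 1080
    printf "NV12: %.1f MB/frame (%.0f MB/s at 60 fps)\n", px * 12 / 8 / 1e6, px * 12 / 8 * 60 / 1e6
    printf "RGB:  %.1f MB/frame (%.0f MB/s at 60 fps)\n", px * 24 / 8 / 1e6, px * 24 / 8 * 60 / 1e6
}'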
Now that I have switched to the beta client of Steam (on host side), the situation seems to have changed a bit. Same setup as before, now with beta client as of Aug 31st (1661985519):
Results (in ms, avg):
@ 1920x1080, Proton, ~50 fps: capture 7.27, convert 14.08, encode 5.79, network 2.24, decode 0.37, display 0.56
@ 1680x1050, Proton, ~45 fps: capture 8.37, convert 15.58, encode 6.43, network 2.59, decode 0.37, display 0.56
@ 1280x800, Proton, ~60 fps: capture 7.67, convert 9.42, encode 4.57, network 2.22, decode 0.36, display 0.53
@ 1280x720, Proton, ~60 fps: capture 7.04, convert 8.78, encode 3.46, network 2.15, decode 0.34, display 0.52
The conversion and encoding figures have clearly improved, but it's still not enough for higher resolutions (note that the 1920x1080 value seems to be an outlier; its capture resolution is set to 1680x946, which seems kind of odd to me).
Apparently, some effort is being spent on this issue on Valve's side, so here's hoping that the situation will improve further. It would be good to hear some feedback from a Valve representative...
Replying to https://github.com/ValveSoftware/steam-for-linux/issues/8423#issuecomment-1236163113
So I just tested out Stray, streaming from Arch Linux, and I also saw some SIGNIFICANT improvement. Kinda wish this was in any of the patch notes for the release.
EDIT:
I'm still having this issue with resolutions above 1280x800. Streaming to my Steam Deck works fine; streaming to my Steam Link at 1920x1080 limits me to about 45 frames per second with an unplayable 80 ms delay.
Yup, I have noticed improvement too on the hardware Steam Link (beta on both host and client) in the last week or so. Can't say if it's perfect or not, but it seems much more playable. Since there aren't patch notes or official responses, I'm gonna say @urmamasllama and @Goggo66 fixed it :)
Can confirm that the issue is fixed for me as well. It is now streaming at 60fps.
The Steam client beta changelog lists an improvement: https://steamcommunity.com/groups/SteamClientBeta/announcements/detail/3380537195892063927. Haven't had time to test it yet, though.
Games running with Proton lock Remote Play to 30 fps (+10/-3). Tested on Steam Link for Raspberry Pi and Apple TV; both behave exactly the same.