dcommander opened this issue 8 years ago
Hello @dcommander, do you happen to know if adding
LD_PRELOAD="${LD_PRELOAD-}:libdlfaker.so:libvglfaker.so" %command%
to a game's launch options, with no other workarounds, also works for you?
Unfortunately not. The problem is that, in order for Steam to work properly in a remote display environment, both the Steam client itself and any games it launches need to have VirtualGL preloaded into them. So doing the above produces an LD_PRELOAD value of:
libdlfaker.so:libvglfaker.so:/home/drc/.local/share/Steam/ubuntu12_32/gameoverlayrenderer.so:/home/drc/.local/share/Steam/ubuntu12_64/gameoverlayrenderer.so:libdlfaker.so:libvglfaker.so
In order for the games to work properly, the value of LD_PRELOAD passed to them needs to be:
/home/drc/.local/share/Steam/ubuntu12_32/gameoverlayrenderer.so:/home/drc/.local/share/Steam/ubuntu12_64/gameoverlayrenderer.so:libdlfaker.so:libvglfaker.so
It's probably possible to produce another command that, when added to the game's launch options, manipulates LD_PRELOAD in such a way that this can be achieved, but ideally we need something that is globally configurable for all games. Modifying the settings for each game individually isn't a very user-friendly solution.
Note:
Setting the launch options to
LD_PRELOAD="${LD_PRELOAD/libdlfaker.so:libvglfaker.so:/}:libdlfaker.so:libvglfaker.so" %command%
does fully work around the issue, although this is again not very user-friendly. An acceptable solution would be to allow those launch options to be set globally for all Steam games, but I would still prefer that Steam either change its behavior vis-à-vis setting the LD_PRELOAD variable or allow that behavior to be configured with another environment variable. That would allow for working around this problem on a system-wide basis rather than a per-user basis.
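For anyone unsure what that bash expansion does, here is a minimal sketch (the Steam paths are abbreviated examples, not real ones): it removes the first occurrence of the VGL fakers from the value Steam hands the game and re-appends them at the end, so the gameoverlayrenderer.so interposers come first.

LD_PRELOAD="libdlfaker.so:libvglfaker.so:/steam/ubuntu12_32/gameoverlayrenderer.so:/steam/ubuntu12_64/gameoverlayrenderer.so"
echo "${LD_PRELOAD/libdlfaker.so:libvglfaker.so:/}:libdlfaker.so:libvglfaker.so"
# prints: /steam/ubuntu12_32/gameoverlayrenderer.so:/steam/ubuntu12_64/gameoverlayrenderer.so:libdlfaker.so:libvglfaker.so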
Maybe you could ask amonakov here on GitHub about this. It seems like his area of expertise.
Hi, any updates on this? I have the same issue... Thanks
All right, I have been trying to get VGL to work with Steam games for 3 days now, and no matter what order I pass the libs in, it always fails with the errors below and then launches the game on the server. I just don't know what to do now.
ERROR: ld.so: object 'libdlfaker.so' from LD_PRELOAD cannot be preloaded (wrong ELF class: ELFCLASS64): ignored.
ERROR: ld.so: object 'libvglfaker.so' from LD_PRELOAD cannot be preloaded (wrong ELF class: ELFCLASS64): ignored.
Hello @ashlin4010, those messages are hinting that you are trying to use 64 bit libraries with a 32 bit game. It would be a good idea to try the 32 bit variants of libdlfaker.so and libvglfaker.so for the particular game you are testing.
@ashlin4010, you need to install both the 64-bit and 32-bit VirtualGL packages. Installing just the 64-bit VGL package will only allow you to run 64-bit OpenGL applications.
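If it helps with debugging, the file utility reports which ELF class a library was built for (the path below is only an example; point it at wherever your distro installs the VGL fakers):

file /usr/lib/libvglfaker.so
# prints something like "ELF 64-bit LSB shared object, x86-64, ..." for a 64-bit build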
OK, I have fixed the problem, or at least there are no more errors now. I was doing something like this:
LD_PRELOAD=/home/drc/.local/share/Steam/ubuntu12_32/gameoverlayrenderer.so:/home/drc/.local/share/Steam/ubuntu12_64/gameoverlayrenderer.so:libdlfaker.so:libvglfaker.so
This gave me the errors:
ERROR: ld.so: object 'libdlfaker.so' from LD_PRELOAD cannot be preloaded (wrong ELF class: ELFCLASS64): ignored.
ERROR: ld.so: object 'libvglfaker.so' from LD_PRELOAD cannot be preloaded (wrong ELF class: ELFCLASS64): ignored.
So I did the same thing as before, but with the 32-bit variants of libdlfaker.so and libvglfaker.so, and it gave me something like:
ERROR: ld.so: object 'libdlfaker.so' from LD_PRELOAD cannot be preloaded (wrong ELF class: ELFCLASS32): ignored.
ERROR: ld.so: object 'libvglfaker.so' from LD_PRELOAD cannot be preloaded (wrong ELF class: ELFCLASS32): ignored.
Then I passed them both; that did not work, so I changed the order. Still nothing, so I then did
LD_PRELOAD=""
and it worked! No more errors. The game starts ... and it's on the server's screen.
Now I have to find out why this is happening; it does not happen with other programs.
Stuff that works: Kerbal Space Program, Starbound, glxgears, Terraria, gnome-system-monitor
Games that work but on the wrong screen: Portal, BioShock Infinite
Something I just found while writing this is that when I run Portal with ./start.sh (the file that starts the game) from the client, I get
X Error of failed request: BadValue (integer parameter out of range for operation)
Major opcode of failed request: 154 (GLX)
Minor opcode of failed request: 3 (X_GLXCreateContext)
Value in failed request: 0x0
Serial number of failed request: 87
Current serial number in output stream: 88
and when I run vglrun ./start.sh from the client, I get
[VGL] NOTICE: Automatically setting VGL_CLIENT environment variable to
[VGL] 192.168.1.2, the IP address of your SSH client.
X Error of failed request: BadValue (integer parameter out of range for operation)
Major opcode of failed request: 154 (GLX)
Minor opcode of failed request: 3 (X_GLXCreateContext)
Value in failed request: 0x0
Serial number of failed request: 87
Current serial number in output stream: 88
but running ./start.sh on the server: no problem.
Also vglrun ./start.sh on the server: no problem.
@ashlin4010 you should replace "/home/drc" with the path to your home directory, rather than literally copying that from my examples above. Those examples were specific to my machine.
Note my updated comment, in which I recommend setting the launch options to:
LD_PRELOAD="${LD_PRELOAD/libdlfaker.so:libvglfaker.so:/}:libdlfaker.so:libvglfaker.so" %command%
I should have said that the path in the comment above is just an example. I did use the proper paths; I was just on another computer at the time and did not know the paths off the top of my head.
Any updates? I was able to play Dota 2 with that workaround 2 weeks ago (it was still crashing at the start of some matches, but I could work around that by restarting the game and reconnecting), but not anymore... The game crashes (or Steam crashes and causes the game to crash) a second or a few seconds after it loads. So it loads, starts, plays the music, and then crashes just before fetching profile stats. I tried disabling and enabling the Steam overlay, but it seems to have no effect. I wanna play Dota again using my headless server. I have paid for battle passes, Dota Plus, and other in-game items in the past, and I have almost 3000 hours in-game! My Steam ID is STEAM_0:0:58779126
Looks like it is not directly related to Dota 2. It is the Steam client that crashes. It was not crashing around 2 weeks ago with VGL. I just opened the Steam client using a simple script, and it failed after some time while I was navigating through my games, with the following errors:

CAPIJobRequestUserStats - Server response failed 2
Generating new string page texture 152: 256x256, total string texture memory is 3.98 MB
crash_20180730113057_1.dmp[6714]: Uploading dump (out-of-process) /tmp/dumps/crash_20180730113057_1.dmp
/home/nick/.steam/steam/steam.sh: line 876: 6532 Segmentation fault
crash_20180730113057_1.dmp[6714]: Finished uploading minidump (out-of-process): success = yes
crash_20180730113057_1.dmp[6714]: response: CrashID=bp-71121900-da37-411d-b367-fdfe22180730
crash_20180730113057_1.dmp[6714]: file ''/tmp/dumps/crash_20180730113057_1.dmp'', upload yes: ''CrashID=bp-71121900-da37-411d-b367-fdfe22180730''
I can look into it again, but unfortunately, there's a hard limit to what I can do to work around problems like this. Steam really needs to test their software with VirtualGL. VGL has been around for 14 years and is shipping in various Linux distros, so I don't think it's too much to ask that Linux OpenGL ISVs test against it.
Here are the contents of my script:

$ cat ./vglsteam.sh
export VGL_COMPRESS=jpeg
export VGL_SUBSAMP=4x
export VGL_SPOIL=0
export PULSE_SERVER=192.168.10.200
cd ~/.steam/steam
vglrun -np 4 -fps 60 ./steam.sh
Everything else is set to its default values. To run Steam, I connect to the machine with vglconnect and then run that script.
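In other words, the workflow is roughly the following (the user and hostname are placeholders, not details from this thread):

vglconnect nick@gameserver   # from the client machine
./vglsteam.sh                # then, in the resulting shell on the server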
Not sure why you're disabling frame spoiling (frame spoiling is your friend, as it improves responsiveness/reduces latency on slow connections), and activating VGL's frame rate governor is probably pointless, since the game engines are probably regulating the frame rate already. Otherwise that's a pretty standard invocation.
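For concreteness, a version of the script above with frame spoiling left at its default (enabled) and the frame rate governor removed might look like this sketch, keeping everything else as posted:

export VGL_COMPRESS=jpeg
export VGL_SUBSAMP=4x
export PULSE_SERVER=192.168.10.200
cd ~/.steam/steam
vglrun -np 4 ./steam.sh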
Ubuntu 16.04 can't use this workaround because of a bash issue. A workaround for Ubuntu 16.04 is to set the launch options to:
LD_PRELOAD="${LD_PRELOAD#libdlfaker.so:libvglfaker.so:}:libdlfaker.so:libvglfaker.so" %command%
This launch option works well for Dota 2.
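A possible reason this form works where the substitution form doesn't: ${VAR#pattern} (strip a matching prefix) is POSIX shell syntax, whereas ${VAR/pattern/} is a bash extension. A minimal sketch with an abbreviated example path:

LD_PRELOAD="libdlfaker.so:libvglfaker.so:/steam/ubuntu12_32/gameoverlayrenderer.so"
echo "${LD_PRELOAD#libdlfaker.so:libvglfaker.so:}:libdlfaker.so:libvglfaker.so"
# prints: /steam/ubuntu12_32/gameoverlayrenderer.so:libdlfaker.so:libvglfaker.so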
Could someone from Valve specify if and how this preload variable can be generalized for all games? It is always required if you want to run Steam with Bumblebee, especially if you run in Big Picture mode. Since the overlay is injected when launching games, this variable can't be set in a wrapper script or in front of the optirun command.
Any progress on supporting optirun %command% as a custom game runner in the Linux Steam client?
Using the LD_PRELOAD launch option that @wereHuang provided seems to work, although I still see these messages when launching a game:
ERROR: ld.so: object '/home/michael/.steam/debian-installation/ubuntu12_64/gameoverlayrenderer.so' from LD_PRELOAD cannot be preloaded (wrong ELF class: ELFCLASS64): ignored.
ERROR: ld.so: object '/home/michael/.steam/debian-installation/ubuntu12_32/gameoverlayrenderer.so' from LD_PRELOAD cannot be preloaded (wrong ELF class: ELFCLASS32): ignored.
These errors are specific to the way Steam is trying to load things, and they can be safely ignored.
I know this is a bit off topic, but I felt the need to share that 4x and spoil were both problematic for me, and gamma needed to be adjusted because of a weird display problem. I ended up creating two basic wrapper scripts ...
~/.bin/vglcon
to connect to a host (vglcon earpiercer.domain.tld):
#!/bin/sh
[ -z "$1" ] && echo 'hostname required' && exit 1
scp ~/.pulse-cookie "$1:."
vglconnect -s "$1"
~/.bin/vglsteam
to launch steam with the IP address of my connected client:
#!/bin/sh
# Note: This script looks for itself and expects to be named "vglsteam"
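# "w -h" lists sessions as USER TTY FROM ...; field 3 (FROM) of the session whose command line contains "vglsteam" is the connecting client's address (this assumes your "w" displays the FROM column)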
export PULSE_SERVER="$(w -h | awk '/vglsteam/{print $3}')"
cd ~/.steam/steam && vglrun -gamma 1.2 ./steam.sh
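Usage, per the descriptions above (the hostname is the example from earlier):

vglcon earpiercer.domain.tld   # run on the local client; opens a session on the host
vglsteam                       # run in that session on the host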
@MTecknology -c xv activates the X Video Transport in VirtualGL, which was designed for the obsolete Sun Ray platform. We may eventually repurpose that feature, but for the moment, I don't recommend using it. If you want to use a client-side X server, as opposed to a server-side X proxy, then the VGL Transport is your best bet. The X Video Transport doesn't support 1X subsampling or multithreaded encoding, so -samp 1x and -np 4 won't have any effect in the command line above. Not sure why gamma correction is necessary on your system. The games might be using the GLX_EXT_framebuffer_sRGB extension, which has a different gamma than "normal" OpenGL rendering, but that extension is fully supported in VirtualGL.
Ah, lovely ... the joys of copy/pasting things you don't understand. I played around with some of the options and then realized I should just stick with defaults and add one option at a time. I updated my comment to reflect what I'm actually using now. Thanks for the sanity check! :)
No problem. It's really odd that the gamma needs to be corrected by a factor of 1.2. 2.2 would be easier to believe, since that is the gamma of the sRGB colorspace. Systems that don't have proper support for the sRGB GLX and OpenGL extensions would need gamma correction by a factor of 2.2, but since sRGB rendering is part of the OpenGL spec now, such systems would not be OpenGL-conformant. It may very well be a bug or an oversight in VirtualGL as well, but I can't imagine what kind of a bug or oversight would require gamma correction by a factor of 1.2. Maybe Steam is gamma correcting in a way that VirtualGL doesn't properly handle? No one else has reported that issue.
It's apparently not a problem with Steam, just a handful of games, which are the ones I looked at (they all have some forum complaints about darkness). Now I feel like I hijacked an issue for more support requests ... Thanks again!
Please describe your issue in as much detail as possible:
VirtualGL, for those who might not know, is an open source tool that automatically modifies the GLX/OpenGL command stream from Un*x OpenGL applications at run time to force all of the 3D rendering from those applications into off-screen Pbuffers, thus allowing the applications to work properly (and with 3D hardware acceleration) within remote display environments such as VNC or NX. VGL is widely used within the open source community, and it also forms the basis of some commercial Linux remote display products.
Referring to the comprehensive description of the problem at https://github.com/VirtualGL/virtualgl/issues/25, Steam takes the existing contents of the LD_PRELOAD environment variable and adds its own gameoverlayrenderer.so interposers to the end of that environment variable. This causes problems with VirtualGL, for a couple of reasons:

1. The dlsym() function within gameoverlayrenderer.so does not seem to work properly, so when VirtualGL attempts to load functions from libGL, gameoverlayrenderer.so intercepts those dlsym() or glXGetProcAddress() calls and gives VirtualGL pointers to VirtualGL's own interposed functions instead. This causes VGL's interposed functions to call themselves in an infinite recursion loop, leading to stack exhaustion and a segfault.
2. The glXSwapBuffers() function within gameoverlayrenderer.so does not seem to handle any drawables other than windows.

More generally, gameoverlayrenderer.so expects that the calls made to it will be from a Steam game, so when VirtualGL modifies the GLX/OpenGL calls to support remote display, this produces a call sequence that gameoverlayrenderer.so doesn't like. Similarly, VGL expects to be able to make calls directly to libGL, so when gameoverlayrenderer.so modifies the dlsym() or glXGetProcAddress() calls, VirtualGL doesn't like it. https://github.com/VirtualGL/virtualgl/issues/25 goes into gory detail about the things I tried in an attempt to work around the problem within VirtualGL.

The ultimate solution is simply to reverse the order of LD_PRELOAD, placing the gameoverlayrenderer.so interposers at the head of that environment variable and placing any existing LD_PRELOAD contents at the end. However, since this environment variable seems to be set within some binary part of Steam (as opposed to within a hackable script), currently the only way to do this is to modify the launch scripts for each game that is affected by this issue (and, unfortunately, some games affected by it don't have launch scripts.)

The Steam client should either place its gameoverlayrenderer.so interposers at the beginning of LD_PRELOAD rather than at the end, or it should provide another environment variable to allow this behavior to be configured.

Steps for reproducing this issue:
vglrun steam
Expected behavior: The game, as well as the in-game overlay, works properly.
Actual behavior: A segfault occurs, for reasons described at https://github.com/VirtualGL/virtualgl/issues/25.
Nasty hack to work around it: Adding a line to the top of ~/.local/share/Steam/steamapps/common/dota 2 beta/game/dota.sh that exports the corrected LD_PRELOAD value (gameoverlayrenderer.so interposers first, VGL fakers last, as shown above) works around the problem, but this isn't a general-purpose solution, since not all games have launch scripts, and these modifications probably wouldn't survive an update of the games in question.
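For reference, such an export might look like the following sketch, based on the LD_PRELOAD value quoted earlier in this thread (the Steam paths are from one particular machine; substitute your own):

export LD_PRELOAD=$HOME/.local/share/Steam/ubuntu12_32/gameoverlayrenderer.so:$HOME/.local/share/Steam/ubuntu12_64/gameoverlayrenderer.so:libdlfaker.so:libvglfaker.so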