Open Samsagax opened 2 years ago
I have a similar issue.
linux 5.17.9.arch1-1
nvidia-dkms 515.43.04-1
mesa 22.1.0-1
gamescope-git 3.11.31.beta5.r0.ga515153-1
My setup: the Nvidia card has no physical outputs, so its buffer has to be copied to the Intel one that has a display attached. This works on both X and Wayland GNOME sessions, but when I try to run a game with gamescope -- %command%, the game runs and I do hear sound, but I don't get a window.
Using glxgears I get the same error on vkGetPhysicalDeviceFormatProperties2() and no window. The Nvidia card is being used, as confirmed by nvidia-smi as well.
The output of gamescope -- glxgears is:
No CAP_SYS_NICE, falling back to regular-priority compute and threads.
Performance will be affected.
vulkan: selecting physical device 'NVIDIA P106-100'
vulkan: physical device supports DRM format modifiers
vulkan: vkGetPhysicalDeviceFormatProperties2 returned zero modifiers for DRM format 0x3231564E (VkResult: 0)
vulkan: supported DRM formats for sampling usage:
vulkan: 0x34325241
vulkan: 0x34325258
wlserver: [backend/headless/backend.c:82] Creating headless backend
wlserver: Running compositor on wayland display 'gamescope-0'
wlserver: [backend/headless/backend.c:18] Starting headless backend
wlserver: [xwayland/server.c:92] Starting Xwayland on :2
wlserver: [types/wlr_surface.c:741] New wlr_surface 0x562d3b8500e0 (res 0x562d3b824300)
wlserver: [xwayland/server.c:250] Xserver is ready
pipewire: stream state changed: connecting
pipewire: stream state changed: paused
pipewire: stream available on node ID: 59
pipewire: renegotiating stream params (size: 1280x720)
wlserver: [types/wlr_surface.c:741] New wlr_surface 0x562d3ae50df0 (res 0x562d3b849b70)
Error getting buffer
Running synchronized to the vertical refresh. The framerate should be
approximately the same as the monitor refresh rate.
2010 frames in 5.0 seconds = 401.809 FPS
2022 frames in 5.0 seconds = 404.237 FPS
^C
gamescope: received kill signal, terminating!
xwm: Lost connection to the X11 server 0
xwm: X11 I/O error
I assume that it's not a bug and it's just an unsupported use case.
I have the same issue, but I'm on a discrete RTX 3080 card with the latest nvidia-dkms-tkg 515.48.07-209 drivers.
linux 5.18.1.arch1-1
nvidia-dkms-tkg 515.48.07-209
mesa 22.1.1-1
gamescope-git 3.11.31.beta6.r0.gb1c3859-1
Here is my output with MangoHud disabled:
❯ MANGOHUD=0 gamescope glxgears
No CAP_SYS_NICE, falling back to regular-priority compute and threads.
Performance will be affected.
vulkan: selecting physical device 'NVIDIA GeForce RTX 3080': queue family 2
vulkan: physical device supports DRM format modifiers
vulkan: vkGetPhysicalDeviceFormatProperties2 returned zero modifiers for DRM format 0x3231564E (VkResult: 0)
vulkan: supported DRM formats for sampling usage:
vulkan: 0x34325241
vulkan: 0x34325258
wlserver: [backend/headless/backend.c:82] Creating headless backend
wlserver: Running compositor on wayland display 'gamescope-0'
wlserver: [backend/headless/backend.c:18] Starting headless backend
wlserver: [xwayland/sockets.c:63] Failed to bind socket @/tmp/.X11-unix/X0: Address already in use
wlserver: [xwayland/server.c:92] Starting Xwayland on :1
wlserver: [types/wlr_surface.c:741] New wlr_surface 0x55d051649c70 (res 0x55d051625960)
wlserver: [xwayland/server.c:250] Xserver is ready
pipewire: stream state changed: connecting
pipewire: stream state changed: paused
pipewire: stream available on node ID: 35
Running synchronized to the vertical refresh. The framerate should be
approximately the same as the monitor refresh rate.
pipewire: renegotiating stream params (size: 1280x720)
wlserver: [types/wlr_surface.c:741] New wlr_surface 0x55d0515062d0 (res 0x55d0516488d0)
Error getting buffer
19990 frames in 5.0 seconds = 3997.928 FPS
19927 frames in 5.0 seconds = 3985.303 FPS
Similar issue, but I'm running X on my 1070 eGPU via nvidia-xrun, using the eGPU output. Arch Linux.
nvidia 515.48.07-8
linux 5.18.2.arch1-1
gamescope-git 3.11.31.beta4.r10.ga515153-1
Similar situation with glxgears and games: sound, but no window appears.
% gamescope glxgears
No CAP_SYS_NICE, falling back to regular-priority compute and threads.
Performance will be affected.
vulkan: selecting physical device 'NVIDIA GeForce GTX 1070'
vulkan: physical device supports DRM format modifiers
vulkan: vkGetPhysicalDeviceFormatProperties2 returned zero modifiers for DRM format 0x3231564E (VkResult: 0)
vulkan: supported DRM formats for sampling usage:
vulkan: 0x34325241
vulkan: 0x34325258
wlserver: [backend/headless/backend.c:82] Creating headless backend
wlserver: Running compositor on wayland display 'gamescope-0'
wlserver: [backend/headless/backend.c:18] Starting headless backend
wlserver: [xwayland/server.c:92] Starting Xwayland on :1
wlserver: [types/wlr_surface.c:741] New wlr_surface 0x55acb9fb6b10 (res 0x55acb9fd2450)
wlserver: [xwayland/server.c:250] Xserver is ready
pipewire: stream state changed: connecting
pipewire: stream state changed: paused
pipewire: stream available on node ID: 129
Running synchronized to the vertical refresh. The framerate should be
approximately the same as the monitor refresh rate.
pipewire: renegotiating stream params (size: 1280x720)
wlserver: [types/wlr_surface.c:741] New wlr_surface 0x55acb9370f30 (res 0x55acb9fd9660)
Error getting buffer
7836 frames in 5.0 seconds = 1566.939 FPS
New test that works for both Vulkan and GLX applications in the nested case. It seems a lot of information needs to be passed to the stack:
$ DRI_PRIME=1 MESA_VK_DEVICE_SELECT=10de:1347 __NV_PRIME_RENDER_OFFLOAD=1 __VK_LAYER_NV_optimus=NVIDIA_only __GLX_VENDOR_LIBRARY_NAME=nvidia gamescope -- glxgears
Seems like DRI_PRIME=1 needs to be set in addition to MESA_VK_DEVICE_SELECT=<vendor-id>:<device-id>. If any of the __* env vars is missing, the Intel iGPU is used to present the window and #356 will arise.
Hope this helps with debugging the issue. I still couldn't test the embedded case; embedded gamescope doesn't work at all.
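(Not part of the original comment, just a note for anyone reproducing this: a common way to find the <vendor-id>:<device-id> pair that MESA_VK_DEVICE_SELECT expects is lspci.)
$ # The PCI IDs appear in square brackets at the end of each line, e.g. [10de:1347];
$ # that value is what goes into MESA_VK_DEVICE_SELECT.
$ lspci -nn | grep -iE 'vga|3d'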
Can confirm, setting the provided variables does resolve the issue. I had gotten out of the habit of using prime-run since switching to Wayland, but I suppose I will need to start adding it to my gamescope instances.
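(My own note, not from the comment above: on Arch, the prime-run wrapper just sets the Nvidia offload variables before running its arguments, so wrapping gamescope with it should be roughly equivalent to exporting them by hand. Note that DRI_PRIME and MESA_VK_DEVICE_SELECT still have to be added separately.)
$ prime-run gamescope -- glxgears
$ # roughly the same as:
$ __NV_PRIME_RENDER_OFFLOAD=1 __VK_LAYER_NV_optimus=NVIDIA_only __GLX_VENDOR_LIBRARY_NAME=nvidia gamescope -- glxgears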
This doesn't fix the issue for me, unfortunately. I'm not using the hybrid drivers (eGPU); perhaps that's the difference? It still runs with no window appearing:
DRI_PRIME=1 MESA_VK_DEVICE_SELECT=10de:1b81 __NV_PRIME_RENDER_OFFLOAD=1 __VK_LAYER_NV_optimus=NVIDIA_only __GLX_VENDOR_LIBRARY_NAME=nvidia gamescope -- glxgears
No CAP_SYS_NICE, falling back to regular-priority compute and threads.
Performance will be affected.
vulkan: selecting physical device 'NVIDIA GeForce GTX 1070': queue family 2
vulkan: physical device supports DRM format modifiers
vulkan: vkGetPhysicalDeviceFormatProperties2 returned zero modifiers for DRM format 0x3231564E (VkResult: 0)
vulkan: supported DRM formats for sampling usage:
vulkan: 0x34325241
vulkan: 0x34325258
wlserver: [backend/headless/backend.c:82] Creating headless backend
wlserver: [wayland] unable to lock lockfile /run/user/1000/gamescope-0.lock, maybe another compositor is running
wlserver: Running compositor on wayland display 'gamescope-1'
wlserver: [backend/headless/backend.c:18] Starting headless backend
wlserver: [xwayland/server.c:92] Starting Xwayland on :2
wlserver: [types/wlr_surface.c:741] New wlr_surface 0x55adc577ee60 (res 0x55adc577e680)
wlserver: [xwayland/server.c:250] Xserver is ready
pipewire: stream state changed: connecting
pipewire: stream state changed: paused
pipewire: stream available on node ID: 124
Running synchronized to the vertical refresh. The framerate should be
approximately the same as the monitor refresh rate.
pipewire: renegotiating stream params (size: 1280x720)
wlserver: [types/wlr_surface.c:741] New wlr_surface 0x55adc563f890 (res 0x55adc5783920)
Error getting buffer
3668 frames in 5.0 seconds = 733.349 FPS
Not sure if this is the root cause, but I am experiencing this issue only in embedded mode. I do have an Optimus laptop, and I've tried all the env vars mentioned here with no success.
I am not sure whether the Vulkan backend actually makes use of the upstream wlroots Vulkan renderer, but if so, that might be related, since I am not able to launch other wlroots compositors with the Vulkan renderer on my system either.
I know Vulkan on DRM is not fundamentally broken because I can launch RetroArch with its Vulkan driver from a VT just fine.
Note that I can launch gamescope fine from within a Wayland session (without any special variables, no less), but I'd prefer to use the embedded mode for the latency benefits.
Just tested this commit, but I still need to specify MESA_VK_DEVICE_SELECT=10de:1347 for it to work. There seem to be other parts of the stack that need to be signaled. Here is the output with and without the env variable set:
gamescope-stdout-env-not-set.log gamescope-stdout-env-set.log
When it is not set, it still seems to try to load i915 as the device, as shown by this line:
MESA-INTEL: warning: Performance support disabled, consider sysctl dev.i915.perf_stream_paranoid=0
I also tried running without any of the env variables set ($ build/gamescope --prefer-vk-device=10de:1347 -- vkcube) and got the same result as with a missing MESA_VK_DEVICE_SELECT.
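So, presumably, the combination that actually works here is passing the flag together with the env var, something like:
$ MESA_VK_DEVICE_SELECT=10de:1347 build/gamescope --prefer-vk-device=10de:1347 -- vkcube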
Any update on this? I have been wanting to try Gamescope for a very long time
My findings here may very well be of use to you: #560
No luck unfortunately. I tried running
gamescope --disable-layers -w 1280 -h 720 %command%
as the launch command in Bottles, and I can see gamescope in the KDE system monitor and can hear the game like before, but there is still no visual output.
Having the same issue on my 2070 on my desktop PC (so without Optimus)
Same here, Nvidia 3080. Edit: sorry, my fault, the DRM driver wasn't enabled. With it enabled, it works!
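(Adding a note for anyone who hits the same thing; this is the usual way to enable it on Arch-style setups, not taken from the comment above.)
# /etc/modprobe.d/nvidia-drm.conf
options nvidia-drm modeset=1
# then rebuild the initramfs and reboot, e.g. on Arch:
$ sudo mkinitcpio -P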
update?
OK... new findings on my end for the embedded case:
My setup:
Intel doesn't work on current Mesa stable, so I needed to use mesa-git with some fixes from here: https://github.com/ruineka/mesa-22.3.0/releases
After that was installed, my session script worked on Intel graphics, only affected by #356 (I used INTEL_DEBUG=noccs to mitigate it, but performance is terrible).
To get Nvidia to be used I needed to add these environment variables:
DRI_PRIME=1
MESA_VK_DEVICE_SELECT=10de:1347
__NV_PRIME_RENDER_OFFLOAD=1
__VK_LAYER_NV_optimus=NVIDIA_only
__GLX_VENDOR_LIBRARY_NAME=nvidia
GAMESCOPECMD="/usr/bin/gamescope \
-e \
--xwayland-count 2 \
-O \'\*\',eDP-1 \
--default-touch-mode 4 \
--hide-cursor-delay 3000 \
--fade-out-duration 200 \
"
If any of the variables is missing, the session won't launch. Note: I tried to use --prefer-vk-device but it didn't make any difference. Seems like MESA_VK_DEVICE_SELECT is talking to other parts of the stack.
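To make the wiring explicit, this is roughly how it fits together in the session script (a minimal sketch; the client command at the end is illustrative, not my actual script):
export DRI_PRIME=1
export MESA_VK_DEVICE_SELECT=10de:1347
export __NV_PRIME_RENDER_OFFLOAD=1
export __VK_LAYER_NV_optimus=NVIDIA_only
export __GLX_VENDOR_LIBRARY_NAME=nvidia
# GAMESCOPECMD defined as above; 'steam' here is just a placeholder client
exec $GAMESCOPECMD -- steam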
And voilà!
It is not usable in any capacity, since Intel graphics acts as an intermediary and all of its issues stack on top of the Nvidia issues. For example: color corruption is present, and games don't launch (they don't even launch on Intel).
I hope this can be sorted out.
I'm not really sure why everyone seems to be having issues here, but I just wanted to point out that I can use gamescope (in embedded mode at least) quite well with Optimus, both on my laptop screen by setting prefer-vk-device to my Intel chip and setting the proper prime variables, or even directly through the Nvidia card to an external screen.
For the Intel side though, it did require me to manually patch Mesa as mentioned elsewhere. Here is the code so you can see everything I use to make it work with prime render offload, principally the args and env vars.
I own an Optimus laptop that has a Skylake i5 CPU and an Nvidia 940M GPU, running Arch Linux with the latest Nvidia (proprietary) drivers.
(Edit: nvidia_drm modeset=1 is passed as a parameter and is enabled.)
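(A quick way to verify that, not part of the original report:)
$ # should print Y when nvidia-drm kernel mode setting is active
$ cat /sys/module/nvidia_drm/parameters/modeset
Y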
I'm trying to make it work on my Nvidia card alone and tested with:
$ MESA_VK_DEVICE_SELECT=10de:1347 gamescope -- glxgears
This is the output on console: gamescope-stdout-nvidia.log
It seems to be working, but there is no window drawn, nor any graphical output. There are some lines that confirm the Nvidia card is used, but it seems to fail in vkGetPhysicalDeviceFormatProperties2().
Running just
$ gamescope -- glxgears
produces similar output with no warning about the properties and seems to support more DRM formats. The output window is shown and I can interact with it, so I closed it by pressing ESC (the image is tainted by #356). Console output: gamescope-stdout-intel.log
If I run with Optimus offloading (in nested mode; it doesn't work in embedded mode), I get a somewhat desired result: the game outputs to a window and
nvidia-smi
shows the process loaded into GPU memory. (Running with:
$ __NV_PRIME_RENDER_OFFLOAD=1 __VK_LAYER_NV_optimus=NVIDIA_only __GLX_VENDOR_LIBRARY_NAME=nvidia gamescope -- glxgears
)
Is there a way to use the Nvidia GPU alone for the embedded mode in these cases?
I don't know if this is a gamescope issue, but since you are working with NVIDIA to make this work, maybe it is worth having your input (and also reporting it to Nvidia).