Bumblebee-Project / Bumblebee

Bumblebee daemon and client rewritten in C
http://www.bumblebee-project.org/
GNU General Public License v3.0

No GLX on primary X with nvidia & bumblebee #907

Open aostruszka opened 7 years ago

aostruszka commented 7 years ago

This is probably yet another configuration problem (on Ubuntu 16.04). It might be similar to other open issues, but my problem is not with running programs via "optirun"; it is with the normal (primary) X server.

My intended use is CUDA, and I had this working some time ago. After some updates I noticed that I had been switched to nvidia as the primary GPU, so I wanted to go back to the desired setup (intel as primary, nvidia only via bumblebee).

So what I've done is purge all nvidia / bumblebee / primus packages, reboot, and reinstall everything from scratch (I'm using 381 since that is the version I had working before, and it is new enough that CUDA 8 does not complain).
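
Roughly, the commands were the following (a sketch; exact package names depend on the driver PPA in use):

$ sudo apt-get purge 'nvidia-*' 'bumblebee*' 'primus*'
$ sudo reboot
$ sudo apt-get install bumblebee bumblebee-nvidia primus nvidia-381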

$ dpkg -l \*nvidia\* bumblebee\* primus | grep -v '^un'
Desired=Unknown/Install/Remove/Purge/Hold
| Status=Not/Inst/Conf-files/Unpacked/halF-conf/Half-inst/trig-aWait/Trig-pend
|/ Err?=(none)/Reinst-required (Status,Err: uppercase=bad)
||/ Name                             Version                    Architecture Description
+++-================================-==========================-============-===============================================
ii  bumblebee                        3.2.1-100~xenialppa1       amd64        NVIDIA Optimus support for Linux
ii  nvidia-381                       381.22-0ubuntu0~gpu16.04.2 amd64        NVIDIA binary driver - version 381.22
ii  nvidia-opencl-icd-381            381.22-0ubuntu0~gpu16.04.2 amd64        NVIDIA OpenCL ICD
ii  nvidia-prime                     0.8.2                      amd64        Tools to enable NVIDIA's Prime
ii  nvidia-settings                  384.66-0ubuntu1            amd64        Tool for configuring the NVIDIA graphics driver

I have updated the bumblebee config to point to the nvidia-381 paths and switched to intel via prime-select, and with this I am able to use the nvidia card.
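
For reference, these are roughly the entries I changed in /etc/bumblebee/bumblebee.conf (a sketch; the paths assume the nvidia-381 packaging):

[driver-nvidia]
KernelDriver=nvidia-381
LibraryPath=/usr/lib/nvidia-381:/usr/lib32/nvidia-381
XorgModulePath=/usr/lib/nvidia-381/xorg,/usr/lib/xorg/modules

followed by the switch to intel:

$ sudo prime-select intel

With that in place, optirun works: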

$ optirun glxinfo | grep -i opengl
OpenGL vendor string: NVIDIA Corporation
OpenGL renderer string: GeForce GT 740M/PCIe/SSE2
OpenGL core profile version string: 4.5.0 NVIDIA 381.22
OpenGL core profile shading language version string: 4.50 NVIDIA
OpenGL core profile context flags: (none)
OpenGL core profile profile mask: core profile
OpenGL core profile extensions:
OpenGL version string: 4.5.0 NVIDIA 381.22
OpenGL shading language version string: 4.50 NVIDIA
OpenGL context flags: (none)
OpenGL profile mask: (none)
OpenGL extensions:

Xorg.8.log shows the correct GLX extension being loaded:

$ grep -i glx -A 5 /var/log/Xorg.8.log
[  1403.315] (II) "glx" will be loaded by default.
[  1403.315] (II) LoadModule: "glx"
[  1403.315] (II) Loading /usr/lib/nvidia-381/xorg/libglx.so
[  1403.321] (II) Module glx: vendor="NVIDIA Corporation"
[  1403.321]    compiled for 4.0.2, module version = 1.0.0
[  1403.321]    Module class: X.Org Server Extension
[  1403.321] (II) NVIDIA GLX Module  381.22  Thu May  4 00:17:15 PDT 2017
[  1403.321] (II) LoadModule: "nvidia"
[  1403.321] (II) Loading /usr/lib/nvidia-381/xorg/nvidia_drv.so
[  1403.321] (II) Module nvidia: vendor="NVIDIA Corporation"
[  1403.321]    compiled for 4.0.2, module version = 1.0.0
[  1403.322]    Module class: X.Org Video Driver
--
[  1403.765] (II) Initializing extension GLX
[  1403.765] (II) Indirect GLX disabled.

glxgears shows up properly on screen:

$ optirun glxgears
4922 frames in 5.0 seconds = 984.344 FPS
5493 frames in 5.0 seconds = 1098.575 FPS
5157 frames in 5.0 seconds = 1031.189 FPS
5466 frames in 5.0 seconds = 1093.097 FPS
4981 frames in 5.0 seconds = 996.125 FPS

and the utilities from the CUDA examples correctly detect my card (when run via optirun).

So in theory this is a success; however, with the above setup I have lost the GLX extension on my primary "intel" server!

$ glxinfo
name of display: :0
Error: couldn't find RGB GLX visual or fbconfig

The Xorg log seems to indicate that a valid GLX extension is being loaded:

$ grep -i glx -A 5 /var/log/Xorg.0.log
[    36.122] (II) "glx" will be loaded by default.
[    36.122] (II) LoadModule: "glx"
[    36.122] (II) Loading /usr/lib/xorg/modules/extensions/libglx.so
[    36.123] (II) Module glx: vendor="X.Org Foundation"
[    36.123]    compiled for 1.18.4, module version = 1.0.0
[    36.123]    ABI class: X.Org Server Extension, version 9.0
[    36.123] (==) AIGLX enabled
[    36.123] (==) Matched intel as autoconfigured driver 0
[    36.123] (==) Matched intel as autoconfigured driver 1
[    36.123] (==) Matched modesetting as autoconfigured driver 2
[    36.123] (==) Matched fbdev as autoconfigured driver 3
[    36.123] (==) Matched vesa as autoconfigured driver 4
--
[    36.162] (II) AIGLX: enabled GLX_MESA_copy_sub_buffer
[    36.162] (II) AIGLX: enabled GLX_ARB_create_context
[    36.162] (II) AIGLX: enabled GLX_ARB_create_context_profile
[    36.162] (II) AIGLX: enabled GLX_EXT_create_context_es{,2}_profile
[    36.162] (II) AIGLX: enabled GLX_INTEL_swap_event
[    36.162] (II) AIGLX: enabled GLX_SGI_swap_control and GLX_MESA_swap_control
[    36.162] (II) AIGLX: enabled GLX_EXT_framebuffer_sRGB
[    36.162] (II) AIGLX: enabled GLX_ARB_fbconfig_float
[    36.162] (II) AIGLX: enabled GLX_EXT_fbconfig_packed_float
[    36.162] (II) AIGLX: GLX_EXT_texture_from_pixmap backed by buffer objects
[    36.162] (II) AIGLX: enabled GLX_ARB_create_context_robustness
[    36.162] (II) AIGLX: Loaded and initialized i965
[    36.162] (II) GLX: Initialized DRI2 GL provider for screen 0

I've seen some issues where libglx.so gets overwritten by the nvidia packages, but that does not seem to be the case here:

$ dpkg -S /usr/lib/xorg/modules/extensions/libglx.so
xserver-xorg-core: /usr/lib/xorg/modules/extensions/libglx.so
$ debsums xserver-xorg-core | grep glx
/usr/lib/xorg/modules/extensions/libglx.so                                    OK
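
One more check, to rule out a dpkg diversion (the nvidia packages have used dpkg-divert for libglx.so in the past); this should come back empty if nothing has diverted the file:

$ dpkg-divert --list | grep -i glx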

Does anybody know how to fix this problem? Among other things, it prevents Unity from loading (I end up in a login loop and had to switch to awesome in order to file this issue).

I'd appreciate any help. Best regards

kranzo commented 7 years ago

I can confirm this problem on Ubuntu 16.04.

--> uname -a
Linux XXX 4.4.0-101-generic #124-Ubuntu SMP Fri Nov 10 18:29:59 UTC 2017 x86_64 x86_64 x86_64 GNU/Linux
--> lspci | egrep "VGA|3D|Display"
00:02.0 VGA compatible controller: Intel Corporation Skylake Integrated Graphics (rev 06)
01:00.0 3D controller: NVIDIA Corporation GM107M [GeForce GTX 960M] (rev ff)
-->  optirun glxinfo | grep -i opengl
OpenGL vendor string: NVIDIA Corporation
OpenGL renderer string: GeForce GTX 960M/PCIe/SSE2
OpenGL core profile version string: 4.5.0 NVIDIA 384.98
OpenGL core profile shading language version string: 4.50 NVIDIA
OpenGL core profile context flags: (none)
OpenGL core profile profile mask: core profile
OpenGL core profile extensions:
OpenGL version string: 4.5.0 NVIDIA 384.98
OpenGL shading language version string: 4.50 NVIDIA
OpenGL context flags: (none)
OpenGL profile mask: (none)
OpenGL extensions:

While testing a lot of things I ran into the RGB error; I can't reproduce it anymore, but I got a new one:

name of display: :0
X Error of failed request:  BadValue (integer parameter out of range for operation)
  Major opcode of failed request:  154 (GLX)
  Minor opcode of failed request:  24 (X_GLXCreateNewContext)
  Value in failed request:  0x0
  Serial number of failed request:  35
  Current serial number in output stream:  36

CUDA is working properly with optirun.

Any ideas appreciated.

kranzo commented 7 years ago

Extra information:

--> ldd $(which glxinfo)
    linux-vdso.so.1 =>  (0x00007ffda9fed000)
    libGL.so.1 => /usr/lib/nvidia-384/libGL.so.1 (0x00007f18711e8000)
    libX11.so.6 => /usr/lib/x86_64-linux-gnu/libX11.so.6 (0x00007f1870eae000)
    libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007f1870ae4000)
    libnvidia-tls.so.384.98 => /usr/lib/nvidia-384/tls/libnvidia-tls.so.384.98 (0x00007f18708e0000)
    libnvidia-glcore.so.384.98 => /usr/lib/nvidia-384/libnvidia-glcore.so.384.98 (0x00007f186ea24000)
    libXext.so.6 => /usr/lib/x86_64-linux-gnu/libXext.so.6 (0x00007f186e812000)
    libdl.so.2 => /lib/x86_64-linux-gnu/libdl.so.2 (0x00007f186e60e000)
    libxcb.so.1 => /usr/lib/x86_64-linux-gnu/libxcb.so.1 (0x00007f186e3ec000)
    /lib64/ld-linux-x86-64.so.2 (0x00007f187152a000)
    libm.so.6 => /lib/x86_64-linux-gnu/libm.so.6 (0x00007f186e0e3000)
    libXau.so.6 => /usr/lib/x86_64-linux-gnu/libXau.so.6 (0x00007f186dedf000)
    libXdmcp.so.6 => /usr/lib/x86_64-linux-gnu/libXdmcp.so.6 (0x00007f186dcd9000)
--> LD_PRELOAD=/usr/lib/x86_64-linux-gnu/mesa/libGL.so glxinfo |head -5
name of display: :0
display: :0  screen: 0
direct rendering: Yes
server glx vendor string: SGI
server glx version string: 1.4
--> LD_PRELOAD=/usr/lib/x86_64-linux-gnu/mesa/libGL.so ldd $(which glxinfo)
    linux-vdso.so.1 =>  (0x00007ffecfdf7000)
    /usr/lib/x86_64-linux-gnu/mesa/libGL.so (0x00007fc971598000)
    libX11.so.6 => /usr/lib/x86_64-linux-gnu/libX11.so.6 (0x00007fc97125e000)
    libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007fc970e94000)
    libexpat.so.1 => /lib/x86_64-linux-gnu/libexpat.so.1 (0x00007fc970c6b000)
    libxcb-dri3.so.0 => /usr/lib/x86_64-linux-gnu/libxcb-dri3.so.0 (0x00007fc970a68000)
    libxcb-present.so.0 => /usr/lib/x86_64-linux-gnu/libxcb-present.so.0 (0x00007fc970865000)
    libxcb-sync.so.1 => /usr/lib/x86_64-linux-gnu/libxcb-sync.so.1 (0x00007fc97065e000)
    libxshmfence.so.1 => /usr/lib/x86_64-linux-gnu/libxshmfence.so.1 (0x00007fc97045b000)
    libglapi.so.0 => /usr/lib/x86_64-linux-gnu/libglapi.so.0 (0x00007fc97022c000)
    libXext.so.6 => /usr/lib/x86_64-linux-gnu/libXext.so.6 (0x00007fc97001a000)
    libXdamage.so.1 => /usr/lib/x86_64-linux-gnu/libXdamage.so.1 (0x00007fc96fe17000)
    libXfixes.so.3 => /usr/lib/x86_64-linux-gnu/libXfixes.so.3 (0x00007fc96fc11000)
    libX11-xcb.so.1 => /usr/lib/x86_64-linux-gnu/libX11-xcb.so.1 (0x00007fc96fa0f000)
    libxcb-glx.so.0 => /usr/lib/x86_64-linux-gnu/libxcb-glx.so.0 (0x00007fc96f7f6000)
    libxcb-dri2.so.0 => /usr/lib/x86_64-linux-gnu/libxcb-dri2.so.0 (0x00007fc96f5f1000)
    libxcb.so.1 => /usr/lib/x86_64-linux-gnu/libxcb.so.1 (0x00007fc96f3cf000)
    libXxf86vm.so.1 => /usr/lib/x86_64-linux-gnu/libXxf86vm.so.1 (0x00007fc96f1c9000)
    libdrm.so.2 => /usr/lib/x86_64-linux-gnu/libdrm.so.2 (0x00007fc96efb8000)
    libm.so.6 => /lib/x86_64-linux-gnu/libm.so.6 (0x00007fc96ecaf000)
    libpthread.so.0 => /lib/x86_64-linux-gnu/libpthread.so.0 (0x00007fc96ea92000)
    libdl.so.2 => /lib/x86_64-linux-gnu/libdl.so.2 (0x00007fc96e88e000)
    /lib64/ld-linux-x86-64.so.2 (0x00007fc97180a000)
    libXau.so.6 => /usr/lib/x86_64-linux-gnu/libXau.so.6 (0x00007fc96e68a000)
    libXdmcp.so.6 => /usr/lib/x86_64-linux-gnu/libXdmcp.so.6 (0x00007fc96e484000)

So libGL still points to nvidia instead of mesa.
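
If I understand Ubuntu's GL packaging correctly, the active libGL is selected via update-alternatives, so something along these lines should show which ld.so.conf is in effect and allow forcing it back to mesa (alternative name and paths taken from a 16.04 install; they may differ):

--> update-alternatives --display x86_64-linux-gnu_gl_conf
--> sudo update-alternatives --set x86_64-linux-gnu_gl_conf /usr/lib/x86_64-linux-gnu/mesa/ld.so.conf
--> sudo ldconfig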