Open aostruszka opened 7 years ago
I can confirm this problem on Ubuntu 16.04.
--> uname -a
Linux XXX 4.4.0-101-generic #124-Ubuntu SMP Fri Nov 10 18:29:59 UTC 2017 x86_64 x86_64 x86_64 GNU/Linux
--> lspci | egrep "VGA|3D|Display"
00:02.0 VGA compatible controller: Intel Corporation Skylake Integrated Graphics (rev 06)
01:00.0 3D controller: NVIDIA Corporation GM107M [GeForce GTX 960M] (rev ff)
--> optirun glxinfo | grep -i opengl
OpenGL vendor string: NVIDIA Corporation
OpenGL renderer string: GeForce GTX 960M/PCIe/SSE2
OpenGL core profile version string: 4.5.0 NVIDIA 384.98
OpenGL core profile shading language version string: 4.50 NVIDIA
OpenGL core profile context flags: (none)
OpenGL core profile profile mask: core profile
OpenGL core profile extensions:
OpenGL version string: 4.5.0 NVIDIA 384.98
OpenGL shading language version string: 4.50 NVIDIA
OpenGL context flags: (none)
OpenGL profile mask: (none)
OpenGL extensions:
While testing a lot of stuff I ran into the RGB error, but I can't reproduce it anymore; instead I got a new one:
name of display: :0
X Error of failed request: BadValue (integer parameter out of range for operation)
Major opcode of failed request: 154 (GLX)
Minor opcode of failed request: 24 (X_GLXCreateNewContext)
Value in failed request: 0x0
Serial number of failed request: 35
Current serial number in output stream: 36
CUDA is working properly with optirun.
Any ideas appreciated.
Extra information:
--> ldd $(which glxinfo)
linux-vdso.so.1 => (0x00007ffda9fed000)
libGL.so.1 => /usr/lib/nvidia-384/libGL.so.1 (0x00007f18711e8000)
libX11.so.6 => /usr/lib/x86_64-linux-gnu/libX11.so.6 (0x00007f1870eae000)
libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007f1870ae4000)
libnvidia-tls.so.384.98 => /usr/lib/nvidia-384/tls/libnvidia-tls.so.384.98 (0x00007f18708e0000)
libnvidia-glcore.so.384.98 => /usr/lib/nvidia-384/libnvidia-glcore.so.384.98 (0x00007f186ea24000)
libXext.so.6 => /usr/lib/x86_64-linux-gnu/libXext.so.6 (0x00007f186e812000)
libdl.so.2 => /lib/x86_64-linux-gnu/libdl.so.2 (0x00007f186e60e000)
libxcb.so.1 => /usr/lib/x86_64-linux-gnu/libxcb.so.1 (0x00007f186e3ec000)
/lib64/ld-linux-x86-64.so.2 (0x00007f187152a000)
libm.so.6 => /lib/x86_64-linux-gnu/libm.so.6 (0x00007f186e0e3000)
libXau.so.6 => /usr/lib/x86_64-linux-gnu/libXau.so.6 (0x00007f186dedf000)
libXdmcp.so.6 => /usr/lib/x86_64-linux-gnu/libXdmcp.so.6 (0x00007f186dcd9000)
--> LD_PRELOAD=/usr/lib/x86_64-linux-gnu/mesa/libGL.so glxinfo |head -5
name of display: :0
display: :0 screen: 0
direct rendering: Yes
server glx vendor string: SGI
server glx version string: 1.4
--> LD_PRELOAD=/usr/lib/x86_64-linux-gnu/mesa/libGL.so ldd $(which glxinfo)
linux-vdso.so.1 => (0x00007ffecfdf7000)
/usr/lib/x86_64-linux-gnu/mesa/libGL.so (0x00007fc971598000)
libX11.so.6 => /usr/lib/x86_64-linux-gnu/libX11.so.6 (0x00007fc97125e000)
libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007fc970e94000)
libexpat.so.1 => /lib/x86_64-linux-gnu/libexpat.so.1 (0x00007fc970c6b000)
libxcb-dri3.so.0 => /usr/lib/x86_64-linux-gnu/libxcb-dri3.so.0 (0x00007fc970a68000)
libxcb-present.so.0 => /usr/lib/x86_64-linux-gnu/libxcb-present.so.0 (0x00007fc970865000)
libxcb-sync.so.1 => /usr/lib/x86_64-linux-gnu/libxcb-sync.so.1 (0x00007fc97065e000)
libxshmfence.so.1 => /usr/lib/x86_64-linux-gnu/libxshmfence.so.1 (0x00007fc97045b000)
libglapi.so.0 => /usr/lib/x86_64-linux-gnu/libglapi.so.0 (0x00007fc97022c000)
libXext.so.6 => /usr/lib/x86_64-linux-gnu/libXext.so.6 (0x00007fc97001a000)
libXdamage.so.1 => /usr/lib/x86_64-linux-gnu/libXdamage.so.1 (0x00007fc96fe17000)
libXfixes.so.3 => /usr/lib/x86_64-linux-gnu/libXfixes.so.3 (0x00007fc96fc11000)
libX11-xcb.so.1 => /usr/lib/x86_64-linux-gnu/libX11-xcb.so.1 (0x00007fc96fa0f000)
libxcb-glx.so.0 => /usr/lib/x86_64-linux-gnu/libxcb-glx.so.0 (0x00007fc96f7f6000)
libxcb-dri2.so.0 => /usr/lib/x86_64-linux-gnu/libxcb-dri2.so.0 (0x00007fc96f5f1000)
libxcb.so.1 => /usr/lib/x86_64-linux-gnu/libxcb.so.1 (0x00007fc96f3cf000)
libXxf86vm.so.1 => /usr/lib/x86_64-linux-gnu/libXxf86vm.so.1 (0x00007fc96f1c9000)
libdrm.so.2 => /usr/lib/x86_64-linux-gnu/libdrm.so.2 (0x00007fc96efb8000)
libm.so.6 => /lib/x86_64-linux-gnu/libm.so.6 (0x00007fc96ecaf000)
libpthread.so.0 => /lib/x86_64-linux-gnu/libpthread.so.0 (0x00007fc96ea92000)
libdl.so.2 => /lib/x86_64-linux-gnu/libdl.so.2 (0x00007fc96e88e000)
/lib64/ld-linux-x86-64.so.2 (0x00007fc97180a000)
libXau.so.6 => /usr/lib/x86_64-linux-gnu/libXau.so.6 (0x00007fc96e68a000)
libXdmcp.so.6 => /usr/lib/x86_64-linux-gnu/libXdmcp.so.6 (0x00007fc96e484000)
So libGL still points to the NVIDIA library instead of Mesa.
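For reference, a quick way to see which libGL.so.1 the dynamic linker picks by default is something like this (just a sketch; the paths shown above are from my system and will differ on yours):

```shell
# Ask the linker cache which libGL.so.1 wins by default.
# ldconfig -p lists cached libraries; awk grabs the first match's path.
resolved=$(/sbin/ldconfig -p 2>/dev/null | awk '/libGL\.so\.1/{print $NF; exit}')
echo "default libGL.so.1: ${resolved:-<none in cache>}"
```

If that prints a path under /usr/lib/nvidia-384, the fix direction would be pointing things back at Mesa, e.g. `sudo prime-select intel` followed by `sudo ldconfig` (assuming the nvidia-prime package is installed).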
This is probably yet another configuration problem (on Ubuntu 16.04). It might be similar to other open issues, but my problem is not with running programs via "optirun"; it is with the normal (primary) X server.
My intended use is CUDA, and I had this working some time ago. After some updates I noticed that I had been switched to nvidia as the primary GPU, so I wanted to go back to the desired setup:
What I did was purge all nvidia / bumblebee / primus packages, reboot, and reinstall everything from scratch (I'm using 381 since that was the version I had working before, and it is new enough that CUDA 8 does not complain).
I updated the bumblebee config to point to the nvidia-381 paths and switched to intel via prime-select, and with this I am able to use the nvidia card:
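For completeness, the entries I mean are roughly these (only the relevant keys from /etc/bumblebee/bumblebee.conf; the -381 suffix matches my driver version, adjust to yours):

```ini
[driver-nvidia]
KernelDriver=nvidia-381
LibraryPath=/usr/lib/nvidia-381:/usr/lib32/nvidia-381
XorgModulePath=/usr/lib/nvidia-381/xorg,/usr/lib/xorg/modules
```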
Xorg.8.log shows the correct glx extension being loaded:
glxgears shows up properly on screen:
and utilities from the CUDA examples correctly detect my card (when run via optirun).
So in theory this is a success; however, with the above setup I've lost the GLX extension on my primary "intel" server!
The Xorg log seems to indicate that the valid extension is being loaded:
I've seen some issues where glx gets overwritten by nvidia, but that seems not to be the case here:
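In case it helps anyone checking the same thing, this is roughly how I looked (a sketch; the path assumes Ubuntu 16.04 and a stock xserver-xorg-core install):

```shell
# See where Xorg's libglx.so actually points; if the nvidia installer
# had clobbered it, the symlink target would live under /usr/lib/nvidia-*.
glx=/usr/lib/xorg/modules/extensions/libglx.so
if [ -e "$glx" ]; then
    echo "libglx.so -> $(readlink -f "$glx")"
else
    echo "no $glx present"
fi
# The server-side view is in the X log: grep -i glx /var/log/Xorg.0.log
```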
Does anybody know how to fix this problem? Among other things, it prevents Unity from loading (I end up in a login loop, and I had to switch to awesome in order to file this issue).
I'd appreciate any help. Best regards