Platform
Arch Linux x86 NVIDIA
GPU, drivers, and screen setup
direct rendering: Yes
Memory info (GL_NVX_gpu_memory_info):
    Dedicated video memory: 8192 MB
    Total available memory: 8192 MB
    Currently available dedicated video memory: 7460 MB
OpenGL vendor string: NVIDIA Corporation
OpenGL renderer string: NVIDIA GeForce RTX 3070/PCIe/SSE2
OpenGL core profile version string: 4.6.0 NVIDIA 545.29.06
OpenGL core profile shading language version string: 4.60 NVIDIA
OpenGL core profile context flags: (none)
OpenGL core profile profile mask: core profile
OpenGL version string: 4.6.0 NVIDIA 545.29.06
OpenGL shading language version string: 4.60 NVIDIA
OpenGL context flags: (none)
OpenGL profile mask: (none)
OpenGL ES profile version string: OpenGL ES 3.2 NVIDIA 545.29.06
OpenGL ES profile shading language version string: OpenGL ES GLSL ES 3.20
Environment
qtile and i3 (tested on both)
picom version
vgit-fc1d1
Diagnostics
Version: vgit-fc1d1
Extensions:
  Shape: Yes
  RandR: Yes
  Present: Present
Misc:
  Use Overlay: Yes
  Config file used: /etc/xdg/picom.conf
Drivers (inaccurate):
  NVIDIA
Backend: glx
  Driver vendors:
    GLX: NVIDIA Corporation
    GL: NVIDIA Corporation
  GL renderer: NVIDIA GeForce RTX 3070/PCIe/SSE2
Backend: egl
  Driver vendors:
    EGL: NVIDIA
    GL: NVIDIA Corporation
  GL renderer: NVIDIA GeForce RTX 3070/PCIe/SSE2
Steps of reproduction
Use an NVIDIA GPU (unclear whether this is required)
Disable vsync
Enable the glx backend (see the config excerpt below)
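A minimal picom.conf excerpt for the repro (illustrative sketch; backend and vsync are standard picom options, everything else left at defaults):

    # select the OpenGL backend
    backend = "glx";
    # explicitly disable vsync; this is the setting that gets ignored
    vsync = false;

Swapping in backend = "xrender" with the same vsync = false behaves as expected.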
Expected behavior
vsync should be disabled, as it is with the xrender backend
Current Behavior
vsync stays enabled regardless of the setting
Other details
Picom and every fork of it force vsync with the glx backend. This is problematic because on X11, a multi-monitor setup with mixed refresh rates always vsyncs to the lowest common refresh rate (e.g. a 144 Hz monitor next to a 60 Hz one is capped at 60 FPS). The only way to get the proper refresh rate on each monitor in a mixed-rate X11 setup is to disable vsync.
This may or may not be related, but I also cannot get dual_kawase blur to work, and glx CPU usage isn't much lower than xrender's despite being GPU accelerated.
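For reference, a typical dual_kawase configuration of the kind that fails to take effect here (illustrative; blur-method and blur-strength are documented picom options, and the exact strength value is arbitrary):

    # dual_kawase is only implemented in the glx/egl backends, not xrender
    blur-method = "dual_kawase";
    blur-strength = 5;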