yshui / picom

A lightweight compositor for X11 with animation support
https://picom.app/

Vsync forced on glx backend #1203

Open Shringe opened 7 months ago

Shringe commented 7 months ago

Platform

Arch Linux x86 NVIDIA

GPU, drivers, and screen setup

direct rendering: Yes
Memory info (GL_NVX_gpu_memory_info):
    Dedicated video memory: 8192 MB
    Total available memory: 8192 MB
    Currently available dedicated video memory: 7460 MB
OpenGL vendor string: NVIDIA Corporation
OpenGL renderer string: NVIDIA GeForce RTX 3070/PCIe/SSE2
OpenGL core profile version string: 4.6.0 NVIDIA 545.29.06
OpenGL core profile shading language version string: 4.60 NVIDIA
OpenGL core profile context flags: (none)
OpenGL core profile profile mask: core profile

OpenGL version string: 4.6.0 NVIDIA 545.29.06
OpenGL shading language version string: 4.60 NVIDIA
OpenGL context flags: (none)
OpenGL profile mask: (none)

OpenGL ES profile version string: OpenGL ES 3.2 NVIDIA 545.29.06
OpenGL ES profile shading language version string: OpenGL ES GLSL ES 3.20

Environment

qtile and i3 (tested)

picom version

vgit-fc1d1

Diagnostics

Version: vgit-fc1d1

Extensions:

Misc:

Drivers (inaccurate):

NVIDIA

Backend: glx

Backend: egl

Steps of reproduction

  1. Use an NVIDIA GPU(?)
  2. Disable vsync
  3. Enable the glx backend (a minimal config sketch follows this list)
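
For reference, a minimal picom configuration that should reproduce this, assuming the stock backend and vsync options (the file path is just an example):

# ~/.config/picom.conf -- minimal repro sketch (path is an assumption)
backend = "glx";   # switch to the GLX backend
vsync = false;     # explicitly request vsync off; with glx it stays on anyway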

Expected behavior

Vsync should be disabled, as it already is with the xrender backend.

Current Behavior

Vsync stays enabled regardless.

Other details

Picom, and every fork of it I have tried, forces vsync on when using the glx backend. This is problematic because on X11, a multi-monitor setup with mixed refresh rates gets vsynced to the lowest common refresh rate. The only way to run each monitor at its proper refresh rate in such a setup on X11 is to disable vsync.
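
For context, vsync under GLX is normally governed by the swap interval. Below is a minimal sketch of how an application could turn it off via the standard GLX_EXT_swap_control extension; this is illustrative only, not picom's actual code, and whether picom's glx backend ever sets the interval to 0 is exactly what this issue is about:

// Minimal sketch: disabling vsync for a GLX drawable via GLX_EXT_swap_control.
// A swap interval of 0 lets glXSwapBuffers return without waiting for vblank;
// an interval of 1 (a common driver default) is what produces forced vsync.
#include <GL/glx.h>

typedef void (*PFNGLXSWAPINTERVALEXT)(Display *, GLXDrawable, int);

static void disable_vsync(Display *dpy, GLXDrawable drawable) {
    PFNGLXSWAPINTERVALEXT swap_interval =
        (PFNGLXSWAPINTERVALEXT)glXGetProcAddress(
            (const GLubyte *)"glXSwapIntervalEXT");
    if (swap_interval != NULL) {
        swap_interval(dpy, drawable, 0); // 0 = never block on vblank
    }
}

On the NVIDIA driver specifically, the environment variable __GL_SYNC_TO_VBLANK=0 can also override the swap interval from outside the application, which may be worth testing as a workaround.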

This may or may not be related, but I also cannot get dual_kawase blur to work, and glx CPU usage is not much lower than xrender despite being GPU-accelerated.
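
For completeness, these are the blur settings I would expect to enable dual_kawase, using picom's documented option names; the strength value here is just an example:

# Blur sketch -- dual_kawase requires the glx (or egl) backend
blur-method = "dual_kawase";
blur-strength = 5;   # example strength; documented range is roughly 1-20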