Pshemas opened 8 years ago
Consider me puzzled - it seems the biggest impact on the "jumpiness" is ... whether I have the Chrome browser open. When I have it open, most of the windows seem to slow down. What the heck? I'm checking whether there are other apps that trigger this.
I am sure there are issues about this problem already, but the glx backend, in contrast to the xrender backend, is very laggy/stuttery/slow when moving windows, especially if you are playing games or videos. That would explain why it is laggier when you move windows with Chrome open.
Brottweiler - but the tricky thing is that Chrome makes other windows on the same monitor laggy too. Close Chrome or move the window to the 2nd monitor and all is back to normal. It also doesn't matter whether you open a "heavy" page with videos or a blank page - Chrome makes other windows laggy in either case. Open a heavy page in, let's say, Firefox and all is fine. I'd happily switch to the xrender backend, but I haven't figured out how to set it up in a way that would eliminate tearing. Any ideas?
Yes, that is true. If I only have a terminal emulator open, moving it is smooth. If I play a video, or a game, or possibly just have Chrome open as well, moving any window will be laggy. All windows on that workspace will be laggy. I am not sure about monitors because I only have one currently.
I think Chrome is laggy just from having it open, but if you also play a YouTube video in Chrome, I think it gets even laggier.
> Open a heavy page in, let's say, Firefox and all is fine.
Oh, so only chrome makes this issue happen? I've not tested Firefox.
If you want to use xrender and try to avoid screen tearing, you could look into https://wiki.archlinux.org/index.php/NVIDIA/Troubleshooting#Avoid_screen_tearing. I've encountered issues with using the 20-nvidia.conf X config file, but the command works fine. Try that.
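For reference, the config-file variant from that wiki page looks roughly like this - a sketch only, and the Identifier and metamodes string would need adjusting to your own setup:

    Section "Screen"
        Identifier "Screen0"
        Option "metamodes" "nvidia-auto-select +0+0 { ForceFullCompositionPipeline = On }"
    EndSection

saved as something like /etc/X11/xorg.conf.d/20-nvidia.conf.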
Thx Brottweiler - will try that :+1: . Also I've found an even worse offender - glxgears :D . With it open, all the windows get CRAZY laggy (even on a PC designed to handle 3D design apps).
Tried: nvidia-settings --assign CurrentMetaMode="nvidia-auto-select +0+0 { ForceFullCompositionPipeline = On }"
For me, the screen flashes for less than a second; it doesn't disconnect the screen... But it could be different for you, of course...
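If you're unsure what output names or MetaMode string the driver is currently using, you could query it first before building your own --assign string; something like:

    nvidia-settings -q CurrentMetaMode

should print the string currently in effect.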
Out of curiosity - what is your xrandr output? Maybe if I find the difference in the way our screens are set up, I'll be able to use ForceFullCompositionPipeline.
Screen 0: minimum 8 x 8, current 1920 x 1080, maximum 16384 x 16384
DVI-I-0 disconnected (normal left inverted right x axis y axis)
DVI-I-1 connected 1920x1080+0+0 (normal left inverted right x axis y axis) 531mm x 299mm
1920x1080 60.00*+
1680x1050 59.95
1600x900 60.00
1280x1024 75.02 60.02
1280x800 59.81
1280x720 60.00
1024x768 75.03 60.00
800x600 75.00 60.32
640x480 75.00 59.94
HDMI-0 disconnected (normal left inverted right x axis y axis)
HDMI-1 disconnected (normal left inverted right x axis y axis)
DVI-D-0 disconnected (normal left inverted right x axis y axis)
Ahhh... ok, but it seems you tested it on a single-screen setup. Mine is different - 2 monitors connected :/ . So on my setup this command disconnects one of the monitors. I'm digging through the net looking for a way to use ForceFullCompositionPipeline on a multi-monitor setup. Let me know if anything crosses your mind.
@Pshemas ah, maybe this will help https://www.gamingonlinux.com/articles/i-have-finally-found-a-way-to-sort-out-screen-tearing-on-nvidia-with-linux.7213
Thank you Brottweiler! That helped a lot - thanks to it I've been able to figure out how I should set it up on my end: nvidia-settings --assign CurrentMetaMode="DP-4:2560x1440_60 +0+0 { ForceFullCompositionPipeline = On }, HDMI-0:1920x1080_60 +2560+0 { ForceFullCompositionPipeline = On }"
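Note that the command doesn't survive an X restart, so I assume you'd either run it from an autostart script or bake the equivalent into xorg.conf as a metamodes option; an untested sketch, using the output names from my setup:

    Section "Screen"
        Identifier "Screen0"
        Option "metamodes" "DP-4: 2560x1440_60 +0+0 { ForceFullCompositionPipeline = On }, HDMI-0: 1920x1080_60 +2560+0 { ForceFullCompositionPipeline = On }"
    EndSection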
I'll allow myself to butt in here, regarding the issue at hand and the circumstances that make it happen (video playback, whether in a browser or a local media player; games; and other OpenGL/GPU-accelerated programs).
This thread made me go back to the problem and try to solve it.
I have a Kepler GPU and am using the latest (370.28) NVIDIA driver together with compton. I have achieved reliable vsync without needing to tinker with my xorg.conf, though the lag issue persisted up until a moment ago.
Though it persists with applications such as glxgears, I no longer experience the issue when setting "glx-swap-method" to "buffer-age". I'm using "glx" as the backend, with vsync achieved via the "opengl-swc" method.
I hope some people can test and report.
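For anyone wanting to try the same combination, the relevant part of my compton config looks roughly like this (everything else left at defaults):

    backend = "glx";
    vsync = "opengl-swc";
    glx-swap-method = "buffer-age";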
No go here. I have a Kepler-based GPU on 370.28; using compton with any set of settings, including glx + opengl-swc + buffer-age, causes windows to move noticeably slower than without a compositor or with a different compositor. Using buffer-age caused programs to break badly, flipping between old contents and new contents.
I know they say not to use glxgears to test, but if I use compiz or mutter, for example, there is no tearing, windows don't move slowly, program contents update quickly, and glxgears doesn't slow down the system. (I cannot use compiz or mutter due to compatibility issues with MATE.)
EDIT: Using ForceCompositionPipeline (not Full, though Full works too) appears to get rid of tearing. With that I can use marco's compositor instead and get no tearing, fast windows, and quick program updates; even though glxgears still slows things down, it updates faster than under compton.
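For reference, I set it with the same kind of nvidia-settings command used earlier in the thread, just with the option name swapped; adjust the mode string to your own outputs:

    nvidia-settings --assign CurrentMetaMode="nvidia-auto-select +0+0 { ForceCompositionPipeline = On }"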
I've been playing more with the settings and I've noticed that while setting the backend to glx eliminates tearing, it introduces one issue on my dual-monitor setup - moving windows stops being smooth; it's a bit "jumpy". Setting it to xrender does not introduce this problem, but sadly there's noticeable tearing, especially visible on one of the monitors (the 2nd one seems at the very least almost tear-free). Any ideas what's the cause and how I could fix it? I'm using an Nvidia GTX 980 with proprietary driver 367.44. Here's xrandr output: