JustCauseWhyNot opened 1 year ago
This is probably only relevant when gamescope is running in nested mode (which I think is the mode JustCauseWhyNot is using): I did some basic profiling by running gamescope -f -- glxgears under vktracer, and I saw that there are some periodic gaps in the command buffer, which usually seem to begin after the resetFences() command that is done after the AcquireNextImageKHR() command.
Looking at the acquire_next_image() function in rendervulkan.cpp, it seems like resetFences() is always called right after WaitForFences() and AcquireNextImageKHR() succeed.
Looking at https://themaister.net/blog/2019/08/14/yet-another-blog-explaining-vulkan-synchronization/ and the code the author wrote for handling fences around AcquireNextImageKHR in https://github.com/Themaister/Granite/blob/master/vulkan/wsi.cpp#L508 (older version of it: https://github.com/Themaister/Granite/blob/141259365a2b837edf0966e8507e45ff4f97cc15/vulkan/wsi.cpp), perhaps there is a way this can be handled better?
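For illustration, here's a minimal sketch of the deferred-reset idea (my own stand-in code, assuming a single reusable acquire fence; this is not gamescope's or Granite's actual code): the fence is waited on and reset only right before it gets reused for the next acquire, rather than immediately after a successful wait.

```cpp
#include <vulkan/vulkan.h>
#include <cstdint>

// Hypothetical sketch: lazily reset the acquire fence right before reuse,
// instead of calling vkResetFences() immediately after vkWaitForFences()
// and vkAcquireNextImageKHR() succeed.
VkResult acquire_next_image_lazy_reset( VkDevice device, VkSwapchainKHR swapchain,
                                        VkSemaphore acquireSemaphore, VkFence acquireFence,
                                        uint32_t *pImageIndex, bool *pFencePending )
{
    if ( *pFencePending )
    {
        // Only block on (and reset) the fence when we're about to reuse it.
        vkWaitForFences( device, 1, &acquireFence, VK_TRUE, UINT64_MAX );
        vkResetFences( device, 1, &acquireFence );
        *pFencePending = false;
    }

    VkResult res = vkAcquireNextImageKHR( device, swapchain, UINT64_MAX,
                                          acquireSemaphore, acquireFence, pImageIndex );
    if ( res == VK_SUCCESS || res == VK_SUBOPTIMAL_KHR )
        *pFencePending = true; // the fence will signal once the acquire completes
    return res;
}
```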
I'd like to add that, based off of some testing I did, setting -r 139 kinda helped with at least perceived frame pacing, but if there was any slowing down below the set frame limit it'd chug a lot, even if it only went down a frame or two. It seemed as though it wouldn't tear, and I couldn't notice any tearing.
Kinda separately, -r seemed to act as a frame limit, which surprised me because based off of the gamescope --help description I thought -r would just be setting what the frame rate should be, not a limit. But I also found the --frame-limit option to not have any effect at all for capping frames.
A couple of things you could try:
I also suggest you first try running the game w/o a framerate cap, and without the --immediate-flips flag:
MESA_VK_WSI_PRESENT_MODE=mailbox gamescope -w 1970 -h 1108 -W 1440 -H 2560 -F nis --sharpness 8 -S fit -- env MESA_VK_WSI_PRESENT_MODE=immediate gamemoderun %command%
or, if you want to set both gamescope and the game to use mailbox:
MESA_VK_WSI_PRESENT_MODE=mailbox gamescope -w 1970 -h 1108 -W 1440 -H 2560 -F nis --sharpness 8 -S fit -- gamemoderun %command%
or, if you need to use strangle (btw, with strangle you can set a decimal fps cap; I suggest you go to https://www.testufo.com/refreshrate in google chrome/chromium/a chromium-based browser to get an fps value close to your display's refresh rate):
STRANGLE_FPS=<fps-cap-can-be-decimal> STRANGLE_VSYNC=1 STRANGLE_VKONLY=1 ENABLE_VK_LAYER_TORKEL104_libstrangle=1 LD_PRELOAD=libstrangle.so MESA_VK_WSI_PRESENT_MODE=mailbox gamescope -w 1970 -h 1108 -W 1440 -H 2560 -F nis --sharpness 8 -S fit -- env LD_PRELOAD="" VK_LOADER_LAYERS_ENABLE="VK_LAYER_FROG_gamescope_wsi" MESA_VK_WSI_PRESENT_MODE=immediate gamemoderun %command%
there is also another vulkan vsync mode: MESA_VK_WSI_PRESENT_MODE=relaxed
which is supposed to dynamically switch between vsync and non-vsync behavior in a way that would feel smoother than pure vsync. Tho IMHO, the relaxed mode doesn't seem to work that well with gamescope
Oh and, this may or may not be compatible with your gpu, but you can also try running gamescope in embedded mode (less overhead compared to nested mode) by switching to a console tty (pressing ctrl+alt+f<number>) and running:
gamescope -e -F nis --sharpness 8 -S fit -- steam -tenfoot -steamos
The catch with embedded mode is that display scaling w/ -W -H -w -h can't be used with it (tho I think it will still try to scale the output to fit the screen).
oh and btw, --frame-limit
only works in this embedded mode. You'd use it like this:
gamescope --frame-limit 139 -r <your-monitor's-refresh-rate-but-rounded-to-whole-number> -e -F nis --sharpness 8 -S fit -- steam -tenfoot -steamos
I tried out what you sent. For the embedded session I got (EE) Failed to read Wayland events: broken pipe. When running the other 2 top commands I didn't really notice much of a difference. I think I can better articulate what I'm noticing when running gamescope now: I believe I'm actually running at 138 fps (in-game cap), but with gamescope, either it's not tearing or something else is going on, and the result feels like there's just a bunch of tiny micro stutters. When using mangohud I can't measure a difference in frame times when using gamescope vs not using it. I know gamescope has some performance overhead, but I wouldn't expect it to make the input and output just feel bad (I don't know how else to say it).
Try setting -r to double your refresh rate. See: https://github.com/ValveSoftware/gamescope/issues/734 You can also try:
su
setcap 'CAP_SYS_NICE=eip' $(which gamescope)
echo -1 > /proc/sys/kernel/sched_rt_runtime_us
exit
(the exit drops you out of root privilege), then add --rt to gamescope's arguments.
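For context, --rt asks the kernel for realtime scheduling, which is what CAP_SYS_NICE (and the sched_rt_runtime_us tweak) unlocks. Purely as an illustration of that mechanism, and not gamescope's actual --rt code, the request looks roughly like this:

```cpp
#include <pthread.h>
#include <sched.h>
#include <cstdio>
#include <cstring>

int main()
{
    sched_param param{};
    param.sched_priority = 1; // lowest realtime priority

    // Needs CAP_SYS_NICE (or root); otherwise the kernel refuses with EPERM.
    int err = pthread_setschedparam( pthread_self(), SCHED_RR, &param );
    if ( err != 0 )
        fprintf( stderr, "pthread_setschedparam: %s\n", strerror( err ) );
    else
        printf( "running with SCHED_RR realtime scheduling\n" );
    return 0;
}
```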
Lastly, if you're using the gamescope package installed from your repo, you can try building and installing gamescope from the git repository. The only catch is that there seems to be a nvidia driver bug that causes issues on the git version of gamescope.
However, I was able to get the git version to work on nvidia for me after reverting a commit that seemed to introduce the regression.
See my fork here: https://github.com/sharkautarch/gamescope/tree/nvidia-fix
if you clone the fork using git, make sure to do cd gamescope; git checkout nvidia-fix so you're on the nvidia-fix branch
After you build and install from the fork, you'll have to:
install the vulkan-tools package, if you don't already have it installed
run MESA_VK_DEVICE_SELECT=list vulkaninfo to list your gpu's vendor:device id
then run gamescope --prefer-vk-device <id> [other args]
EDIT: it turns out that running bbr on the gamescope git version causes bbr to crash.
Lastly, if you're using the gamescope package installed from your repo, you can try building and installing gamescope from the git repository.
I'm using portage, and installed gamescope-3.12.5. I did patch it to fix the resizing issue for nvidia gpus which was causing me grief.
diff --git a/src/rendervulkan.cpp b/src/rendervulkan.cpp
index c508979..8087a8b 100644
--- a/src/rendervulkan.cpp
+++ b/src/rendervulkan.cpp
@@ -488,6 +488,7 @@ private:
VK_FUNC(MapMemory) \
VK_FUNC(QueuePresentKHR) \
VK_FUNC(QueueSubmit) \
+ VK_FUNC(QueueWaitIdle) \
VK_FUNC(ResetCommandBuffer) \
VK_FUNC(ResetFences) \
VK_FUNC(UnmapMemory) \
@@ -2988,7 +2989,7 @@ bool vulkan_make_swapchain( VulkanOutput_t *pOutput )
bool vulkan_remake_swapchain( void )
{
VulkanOutput_t *pOutput = &g_output;
- g_device.waitIdle();
+ g_device.vk.QueueWaitIdle( g_device.queue() );
pOutput->outputImages.clear();
MESA_VK_WSI_PRESENT_MODE=mailbox gamescope -r 310 --rt -w 1970 -h 1108 -W 1440 -H 2560 -F nis --sharpness 10 -S fit -- gamemoderun %command%
That's the command I'm passing into steam launch options for bbr. It's still not solving the problem I'm observing.
I'm trying it out on a native game outside of steam, and from some quick testing it's feeling a lot more like I'd expect. Close enough that I'd almost say it's sorta resolved. I was also having the same issue in this game as I am with bbr, so right now I'd say the issue is with steam, or bbr itself. When I pass gamemoderun to bbr in steam on its own, for instance, I also have an issue of it not turning on in the game.
After more testing with the native linux game: if I pass strangle 139, it'll cause the problem to occur. I do wanna cap my frames, but the game doesn't have a native frame limiter. Any suggestions? Since gamescope's framelimiter doesn't really work for me.
It seems like anything that brings my frames below 140 causes major stuttering issues, but if I wasn't using gamescope those issues wouldn't occur.
I have been experimenting with making gamescope's vblanking system more accurate and lower latency. Tho it still may be a bit buggy, and it incorporates some busy-waiting, which reduces latency but is less energy efficient. It is in the nvidia-fix-and-vblank-debug branch of my fork. The only catch is that it is based off of the latest git version of gamescope, with a workaround for one of the nvidia bugs that causes gamescope to crash.
HOWEVER, all branches of my fork still crash w/nvidia when running games like bbr on gamescope.
I might make another branch based off of an older working version of gamescope that incorporates my vblanking experimentations
Note that if you want to test out my nvidia-fix-and-vblank-debug branch, I would recommend not passing an -r value to my version of gamescope.
I'll try out your branch. Thanks for informing me about it, and putting in the work.
oh yea btw for some reason, it seems like my branch of gamescope crashes if it is given CAP_SYS_NICE
if you gave it CAP_SYS_NICE, just run:
setcap '' $(which gamescope)
to reset it...
HOWEVER, all branches of my fork still crash w/nvidia when running games like bbr on gamescope.
...
I'm using portage, and installed gamescope-3.12.5.
Now that I think about it, bbr did manage to run when I turned off an environment variable: ENABLE_GAMESCOPE_WSI=0 ... gamescope ...
I didn't mention it before because it is unideal to run bbr that way, since ENABLE_GAMESCOPE_WSI=1
lets gamescope import frames directly from the game via a vulkan layer.
looking at this commit: https://github.com/ValveSoftware/gamescope/commit/6b6ffcdaf80b926f96fd610717378240df918707
I can tell that versions of gamescope newer than v3.12.5 default to ENABLE_GAMESCOPE_WSI=1
instead of ENABLE_GAMESCOPE_WSI=0
I remember that bbr's error window when launching with my nvidia-fix branch said something about the game not supporting some window resolution (I think it was whatever gamescope's nested window size was set to). I think the problem is that gamescope changes the resolution/screen size of the game's launcher/initial loading swapchain (or whatever the d3d equivalent of that is), and for whatever reason, this causes an error.
If bbr crashes when running it w/ v3.12.5 gamescope with ENABLE_GAMESCOPE_WSI=1 gamescope <other params> %command%
then that would support my hypothesis.
Running Exec=env __GL_SARPEN_ENABLE=1 __GL_SHARPEN_VALUE=50 __GL_SHARPEN_IGNORE_FILP_GRAIN=17 MESA_VK_WSI_PRESENT_MODE=mailbox gamescope --rt -w 1970 -h 1108 -W 1440 -H 2560 -F nis --sharpness 10 -S fit -- gamemoderun /home/justcausewhynot/Games/WarThunder/launcher
with your branch launches the launcher, but when I press enter to launch the game it instantly crashes. I don't even see anything; I can just tell because of the title saying WT: crash report. I also ran doas setcap '' $(which gamescope), but it didn't seem to make any difference.
Try running it with ENABLE_GAMESCOPE_WSI=0
That fixed auto crashing :+1:, but I'm still noticing significant latency or what feels like it no matter the framerate.
That fixed auto crashing 👍, but I'm still noticing significant latency or what feels like it no matter the framerate.
Try out my new branch: nvidia-fix-and-vblank-debug-extra-experimental. It might still give you momentary periods of stutter, but hopefully it will be an improvement in framepacing.
EDIT: also, I think this new branch sometimes crashes when starting, but it should work after trying to launch it a couple times.
Exec=env __GL_SARPEN_ENABLE=1 __GL_SHARPEN_VALUE=50 __GL_SHARPEN_IGNORE_FILP_GRAIN=17 ENABLE_GAMESCOPE_WSI=1 MESA_VK_WSI_PRESENT_MODE=mailbox gamescope --rt -w 1970 -h 1108 -W 1440 -H 2560 -F nis --sharpness 10 -S fit -- gamemoderun /home/justcausewhynot/Games/WarThunder/linux64/aces
And if I use ENABLE_GAMESCOPE_WSI=0
I get broken pipe over and over.
I know you said it'd fail a couple times, but I tried 5-10 times at least.
Replying to https://github.com/ValveSoftware/gamescope/issues/995#issuecomment-1789370412
I updated the nvidia-fix-and-vblank-debug-extra-experimental branch to fix the issue
Plus also some more improvements to make the framepacing quite smoother.
From my testing, running gamescope with a nested refresh rate set to 120 hz via -r 120
is now pretty smooth for me, which wasn't the case before.
full command I used for testing:
GDK_BACKEND=wayland __GL_SYNC_TO_VBLANK=0 __GL_SHARPEN_VALUE=50 __GL_SHARPEN_IGNORE_FILP_GRAIN=17 MESA_VK_WSI_PRESENT_MODE=immediate gamescope --immediate-flips -r 120 -o 120 -f -w 1970 -h 1108 -W 1440 -H 2560 -F nis --sharpness 10 -S fit --prefer-vk-device 10de:25a0 -- env vblank_mode=0 vrrtest
Tho I will say running it with -r 140
was less smooth, but idk if that is just because my actual laptop display is 60hz, so maybe that's just because 120 is a multiple of 60hz, but 140 isn't.
Make sure to run git pull
to download the new commits to the branch
full command I used for testing:
GDK_BACKEND=wayland __GL_SYNC_TO_VBLANK=0 __GL_SHARPEN_VALUE=50 __GL_SHARPEN_IGNORE_FILP_GRAIN=17 MESA_VK_WSI_PRESENT_MODE=immediate gamescope --immediate-flips -r 120 -o 120 -f -w 1970 -h 1108 -W 1440 -H 2560 -F nis --sharpness 10 -S fit --prefer-vk-device 10de:25a0 -- env vblank_mode=0 vrrtest
Is that what you're recommending I use?
It did boot when I used the previous command I pasted here, but now I'm getting gamescope: ../gamescope-9999/src/wlserver.cpp:1659: void wlserver_key(uint32_t, bool, uint32_t): Assertion `wlserver.wlr.virtual_keyboard_device != nullptr' failed.
It keeps prompting a crash report. And here is the command I ran: Exec=env __GL_SYNC_TO_VBLANK=0 __GL_SARPEN_ENABLE=1 __GL_SHARPEN_VALUE=50 __GL_SHARPEN_IGNORE_FILP_GRAIN=17 MESA_VK_WSI_PRESENT_MODE=immediate gamescope --immediate-flips -r 310 -w 1970 -h 1108 -W 1440 -H 2560 -F nis --sharpness 10 -S fit -- gamemoderun /home/justcausewhynot/Games/WarThunder/linux64/aces
full command I used for testing:
GDK_BACKEND=wayland __GL_SYNC_TO_VBLANK=0 __GL_SHARPEN_VALUE=50 __GL_SHARPEN_IGNORE_FILP_GRAIN=17 MESA_VK_WSI_PRESENT_MODE=immediate gamescope --immediate-flips -r 120 -o 120 -f -w 1970 -h 1108 -W 1440 -H 2560 -F nis --sharpness 10 -S fit --prefer-vk-device 10de:25a0 -- env vblank_mode=0 vrrtest
Is that what you're recommending I use?
No, that was just the command I used to run a program which I find helpful to quickly test the framepacing behavior of gamescope
It did boot when I used the previous command I pasted here, but now I'm getting
gamescope: ../gamescope-9999/src/wlserver.cpp:1659: void wlserver_key(uint32_t, bool, uint32_t): Assertion `wlserver.wlr.virtual_keyboard_device != nullptr' failed.
It keeps prompting a crash report. And here is the command I ran: Exec=env __GL_SYNC_TO_VBLANK=0 __GL_SARPEN_ENABLE=1 __GL_SHARPEN_VALUE=50 __GL_SHARPEN_IGNORE_FILP_GRAIN=17 MESA_VK_WSI_PRESENT_MODE=immediate gamescope --immediate-flips -r 310 -w 1970 -h 1108 -W 1440 -H 2560 -F nis --sharpness 10 -S fit -- gamemoderun /home/justcausewhynot/Games/WarThunder/linux64/aces
make sure to add ENABLE_GAMESCOPE_WSI=0
for vulkan/proton games
For some vulkan, non-proton, apps, it may just work without needing to set ENABLE_GAMESCOPE_WSI=0
if you switch MESA_VK_WSI_PRESENT_MODE=immediate
to MESA_VK_WSI_PRESENT_MODE=mailbox
hopefully there'll be an nvidia driver update at some point that'll fix whatever bug is breaking the gamescope WSI layer.
For now, you'll have to set ENABLE_GAMESCOPE_WSI=0
to disable the WSI layer
also, check that you recompiled gamescope with the latest commits I put out: you can check by doing:
cd gamescope
git branch
-> make sure the star is next to nvidia-fix-and-vblank-debug-extra-experimental
q
to quit from git branchgit log
-> check that you see commit 51f18962804c37adbbba1672627b7206524c351e <... other text>
if you did set ENABLE_GAMESCOPE_WSI=0 and the game still crashed, then I think MESA_VK_WSI_PRESENT_MODE needs to be MESA_VK_WSI_PRESENT_MODE=mailbox
instead of MESA_VK_WSI_PRESENT_MODE=immediate
So try either of these commands: (not sure if setting -r to double your refresh will make it more or less stuttery)
Exec=env ENABLE_GAMESCOPE_WSI=0 vblank_mode=0 __GL_SYNC_TO_VBLANK=0 __GL_SHARPEN_ENABLE=1 __GL_SHARPEN_VALUE=50 __GL_SHARPEN_IGNORE_FILP_GRAIN=17 MESA_VK_WSI_PRESENT_MODE=immediate gamescope --immediate-flips -w 1970 -h 1108 -W 1440 -H 2560 -F nis --sharpness 10 -S fit -- gamemoderun /home/justcausewhynot/Games/WarThunder/linux64/aces
Exec=env ENABLE_GAMESCOPE_WSI=0 vblank_mode=0 __GL_SYNC_TO_VBLANK=0 __GL_SHARPEN_VALUE=50 __GL_SHARPEN_IGNORE_FILP_GRAIN=17 MESA_VK_WSI_PRESENT_MODE=immediate gamescope --immediate-flips -r 310 -w 1970 -h 1108 -W 1440 -H 2560 -F nis --sharpness 10 -S fit -- gamemoderun /home/justcausewhynot/Games/WarThunder/linux64/aces
^ I'm assuming 310 is double your refresh rate
No crashing, and it actually feels way better. But I still am having issues. I do thank you for all you've done to help and contribute.
No crashing, and it actually feels way better. But I still am having issues. I do thank you for all you've done to help and contribute.
Another update: check out a newer version of my fork w/ my new nvidia-fix-and-vblank-debug-extra-experimental-v2 branch
if the nvidia-fix-and-vblank-debug-extra-experimental-v2
branch behaves worse for you,
you can always go back to the nvidia-fix-and-vblank-debug-extra-experimental
branch, which has remained unchanged
Notable differences in terms of usage are a couple of switches I added to the new branch:
--vblank-never-spin Vblank thread: Never busywait to send the next vblank
--vblank-sometimes-spin Vblank thread: Sleep for most of the waiting time ~(3/4),
then wakeup and busywait the quarter remaining time to reduce latency. (Default)
--vblank-always-spin Vblank thread: Busywait the period between sending the next vblank as much as possible.
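For anyone curious, the --vblank-sometimes-spin behaviour described above boils down to a sleep-then-spin wait. This is just my own rough sketch of that idea, not the code from the branch:

```cpp
#include <chrono>
#include <thread>

// Sketch: sleep for roughly 3/4 of the remaining time, then busywait the rest
// so the wakeup lands closer to the target time.
void wait_until_precise( std::chrono::steady_clock::time_point target )
{
    using namespace std::chrono;
    auto now = steady_clock::now();
    if ( target <= now )
        return;

    // Coarse sleep for about three quarters of the wait.
    std::this_thread::sleep_for( ( target - now ) * 3 / 4 );

    // Spin the last quarter for lower latency (at the cost of power).
    while ( steady_clock::now() < target )
        std::this_thread::yield();
}
```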
I'll 100% give this a try!
I'll 100% give this a try!
Update: I just fixed a bug I had made which was causing random crashing.
Run git pull
if you had already downloaded the nvidia-fix-and-vblank-debug-extra-experimental-v2 branch.
On portage I've made a local ebuild of gamescope where I've set it to a live version, which when built will always pull the latest commit, and in the ebuild I've pointed it at your repo & branch. So I can just update gamescope through portage; all I've gotta do is change the branch name to the correct branch in your repo. I just like portage, so I figured I'd share how I update gamescope with you, and anyone else who reads this.
If I try and build your branch it fails to compile. Here's my build.log
If I try and build your branch it fails to compile. Here's my build.log
FAILED: layer/libVkLayer_FROG_gamescope_wsi_x86_64.so.p/VkLayer_FROG_gamescope_wsi.cpp.o
x86_64-pc-linux-gnu-g++ -Ilayer/libVkLayer_FROG_gamescope_wsi_x86_64.so.p -Ilayer -I../gamescope-9999/layer -Iprotocol -I/usr/lib64/libffi/include -fdiagnostics-color=always -D_FILE_OFFSET_BITS=64 -Wall -Winvalid-pch -Wextra -std=c++20 -DWLR_USE_UNSTABLE -Wno-unused-parameter -Wno-missing-field-initializers -Wno-invalid-offsetof -Wno-unused-const-variable -Wno-volatile -Wno-ignored-qualifiers -Wno-missing-braces -ffast-math -DHAVE_PIPEWIRE=1 -DHAVE_OPENVR=0 '-DHWDATA_PNP_IDS="//usr/share/hwdata/pnp.ids"' -march=native -O2 -pipe -fPIC -MD -MQ layer/libVkLayer_FROG_gamescope_wsi_x86_64.so.p/VkLayer_FROG_gamescope_wsi.cpp.o -MF layer/libVkLayer_FROG_gamescope_wsi_x86_64.so.p/VkLayer_FROG_gamescope_wsi.cpp.o.d -o layer/libVkLayer_FROG_gamescope_wsi_x86_64.so.p/VkLayer_FROG_gamescope_wsi.cpp.o -c ../gamescope-9999/layer/VkLayer_FROG_gamescope_wsi.cpp
../gamescope-9999/layer/VkLayer_FROG_gamescope_wsi.cpp: In static member function ‘static void GamescopeWSILayer::VkInstanceOverrides::GetPhysicalDeviceFeatures2(const vkroots::VkInstanceDispatch, VkPhysicalDevice, VkPhysicalDeviceFeatures2)’:
../gamescope-9999/layer/VkLayer_FROG_gamescope_wsi.cpp:504:54: error: ‘FindInChainMutable’ is not a member of ‘vkroots’
  504 | auto pSwapchainMaintenance1Features = vkroots::FindInChainMutable<VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_SWAPCHAIN_MAINTENANCE_1_FEATURES_EXT, VkPhysicalDeviceSwapchainMaintenance1FeaturesEXT>(pFeatures);
../gamescope-9999/layer/VkLayer_FROG_gamescope_wsi.cpp: In function ‘uint32_t GamescopeWSILayer::gamescopeFrameLimiterOverride()’:
../gamescope-9999/layer/VkLayer_FROG_gamescope_wsi.cpp:128:10: warning: ignoring return value of ‘ssize_t pread(int, void*, size_t, __off64_t)’ declared with attribute ‘warn_unused_result’ [-Wunused-result]
  128 | pread(fd, &overrideValue, sizeof(overrideValue), 0);
hmmm, I think that must have been the result of me merging in newer commits from upstream gamescope...
On the other hand... it looks like that same exact compiler error was posted about recently here: https://github.com/ValveSoftware/gamescope/issues/1015#issuecomment-1814846104
It looks like it includes /usr/include/vkroots.h when you have an older version installed.
$ pkgfile /usr/include/vkroots.h
extra/gamescope
I have the same problem while compiling it locally and having an older gamescope version installed by pacman. Adding vkroots to force_fallback_for in meson.build solved the problem. In your case it might be enough to remove the old version before installing the new one.
I guess I can just edit my branch's meson.build file to force the fallback for vkroots...
EDIT: I just pushed a new commit to the nvidia-fix-and-vblank-debug-extra-experimental-v2
branch to force the fallback for vkroots
@JustCauseWhyNot let me know if that helps
I will try that out today. I don't know why github didn't show a notification for your new message.
No worries. By the way, I just pushed out a new branch: nvidia-fix-and-vblank-debug-extra-experimental-v3. The changes I made to the new branch were significant enough that I felt it warranted being a new branch.
If you try the v3 branch out, let me know if there are any issues you find
Going to try it rn.
Failed to build. https://gist.github.com/JustCauseWhyNot/57386bb9e66c165f93f4a238f1a655dd
Ok, I just pushed another commit to nvidia-fix-and-vblank-debug-extra-experimental-v3 which reverts the recent merge-commit that had pulled in new commits from upstream gamescope. Let me know if it compiles for you now.
It did compile, but running Exec=env __GL_SYNC_TO_VBLANK=0 __GL_SARPEN_ENABLE=1 __GL_SHARPEN_VALUE=50 __GL_SHARPEN_IGNORE_FILP_GRAIN=17 MESA_VK_WSI_PRESENT_MODE=immediate gamescope --immediate-flips -r 155 -w 1970 -h 1108 -W 1440 -H 2560 -F nis --sharpness 10 -S fit -- gamemoderun /home/justcausewhynot/Games/WarThunder/linux64/aces
Causes crash on launch.
It did compile, but running
Exec=env __GL_SYNC_TO_VBLANK=0 __GL_SARPEN_ENABLE=1 __GL_SHARPEN_VALUE=50 __GL_SHARPEN_IGNORE_FILP_GRAIN=17 MESA_VK_WSI_PRESENT_MODE=immediate gamescope --immediate-flips -r 155 -w 1970 -h 1108 -W 1440 -H 2560 -F nis --sharpness 10 -S fit -- gamemoderun /home/justcausewhynot/Games/WarThunder/linux64/aces
Causes crash on launch.
hmmm...
try running either:
Exec=env __GL_SYNC_TO_VBLANK=0 __GL_SARPEN_ENABLE=1 __GL_SHARPEN_VALUE=50 __GL_SHARPEN_IGNORE_FILP_GRAIN=17 MESA_VK_WSI_PRESENT_MODE=mailbox gamescope --immediate-flips -r 155 -w 1970 -h 1108 -W 1440 -H 2560 -F nis --sharpness 10 -S fit -- gamemoderun /home/justcausewhynot/Games/WarThunder/linux64/aces
or
Exec=env __GL_SYNC_TO_VBLANK=0 __GL_SARPEN_ENABLE=1 __GL_SHARPEN_VALUE=50 __GL_SHARPEN_IGNORE_FILP_GRAIN=17 MESA_VK_WSI_PRESENT_MODE=immediate ENABLE_GAMESCOPE_WSI=0 gamescope --immediate-flips -r 155 -w 1970 -h 1108 -W 1440 -H 2560 -F nis --sharpness 10 -S fit -- gamemoderun /home/justcausewhynot/Games/WarThunder/linux64/aces
or
Exec=env __GL_SYNC_TO_VBLANK=0 __GL_SARPEN_ENABLE=1 __GL_SHARPEN_VALUE=50 __GL_SHARPEN_IGNORE_FILP_GRAIN=17 MESA_VK_WSI_PRESENT_MODE=mailbox ENABLE_GAMESCOPE_WSI=0 gamescope --immediate-flips -r 155 -w 1970 -h 1108 -W 1440 -H 2560 -F nis --sharpness 10 -S fit -- gamemoderun /home/justcausewhynot/Games/WarThunder/linux64/aces
Lol I'll do that.
Lol I'll do that.
If it still doesn't work, I've made yet another branch: nvidia-fix-and-vblank-debug-extra-experimental-v4 which has all the work I've done in the v3 branch, but based off of an older revision of gamescope upstream
EDIT: I've now tested building and running nvidia-fix-and-vblank-debug-extra-experimental-v4 and it works for me
I've used the v4 branch.
Exec=env __GL_SYNC_TO_VBLANK=0 __GL_SARPEN_ENABLE=1 __GL_SHARPEN_VALUE=50 __GL_SHARPEN_IGNORE_FILP_GRAIN=17 MESA_VK_WSI_PRESENT_MODE=mailbox gamescope --immediate-flips -r 155 -w 1970 -h 1108 -W 1440 -H 2560 -F nis --sharpness 10 -S fit -- gamemoderun /home/justcausewhynot/Games/WarThunder/linux64/aces
That's an instant crash.
Exec=env __GL_SYNC_TO_VBLANK=0 __GL_SARPEN_ENABLE=1 __GL_SHARPEN_VALUE=50 __GL_SHARPEN_IGNORE_FILP_GRAIN=17 MESA_VK_WSI_PRESENT_MODE=immediate ENABLE_GAMESCOPE_WSI=0 gamescope --immediate-flips -r 155 -w 1970 -h 1108 -W 1440 -H 2560 -F nis --sharpness 10 -S fit -- gamemoderun /home/justcausewhynot/Games/WarThunder/linux64/aces
It works, but there's quite frequent, but inconsistent flickering.
Exec=env __GL_SYNC_TO_VBLANK=0 __GL_SARPEN_ENABLE=1 __GL_SHARPEN_VALUE=50 __GL_SHARPEN_IGNORE_FILP_GRAIN=17 MESA_VK_WSI_PRESENT_MODE=mailbox ENABLE_GAMESCOPE_WSI=0 gamescope --immediate-flips -r 155 -w 1970 -h 1108 -W 1440 -H 2560 -F nis --sharpness 10 -S fit -- gamemoderun /home/justcausewhynot/Games/WarThunder/linux64/aces
It's hard to tell if this is any different than #2, but it does feel a bit smoother. It also may have even more frequent flickering, and the screen flickering kinda looks a bit different. I've not done super extensive testing, but I've played around with it for an hour.
I don't have screen capturing software installed. So I can only describe with my words. It's starting to feel actually really good. Enough so that without the flickering I might actually use gamescope over not using it.
Replying to https://github.com/ValveSoftware/gamescope/issues/995#issuecomment-1821480534
Well at least it looks like I've made some overall progress
Quick question: is -r 155
the closest whole number to your monitor's refresh rate? Just want to double check, because if you specify a value that isn't the closest whole number to your monitor's refresh rate (or a multiple of said whole number), you'll probably get funky behavior.
also here's another command you could try:
Exec=env __GL_SYNC_TO_VBLANK=0 __GL_SARPEN_ENABLE=1 __GL_SHARPEN_VALUE=50 __GL_SHARPEN_IGNORE_FILP_GRAIN=17 ENABLE_GAMESCOPE_WSI=0 gamescope -r 155 -w 1970 -h 1108 -W 1440 -H 2560 -F nis --sharpness 10 -S fit -- env MESA_VK_WSI_PRESENT_MODE=mailbox gamemoderun /home/justcausewhynot/Games/WarThunder/linux64/aces
If you want to provide me more info on gamescope's behavior there's a couple things you can do:
Anyhow, thanks for trying out my work!
EDIT: oh, one more question: have you tried running bbr w/ gamescope, and if so, does the behavior feel any different compared to running war thunder w/ gamescope? EDIT EDIT: also, I recommend turning off the in-game vsync setting for any game you run w/ gamescope
EDIT EDIT EDIT: another command to try:
Exec=env MESA_VK_WSI_PRESENT_MODE=mailbox ENABLE_GAMESCOPE_WSI=0 gamescope -r 155 -w 1970 -h 1108 -W 1440 -H 2560 -F nis --sharpness 10 -S fit -- gamemoderun /home/justcausewhynot/Games/WarThunder/linux64/aces
Well at least it looks like I've made some overall progress
Definitely.
Quick question: is -r 155 the closest whole number to your monitor's refresh rate? Just want to double check, because if you specify a value that isn't the closest whole number to your monitor's refresh rate (or a multiple of said whole number), you'll probably get funky behavior.
Yes it is. nvidia-settings shows refresh rate at 154.85Hz.
also here's another command you could try:
Exec=env __GL_SYNC_TO_VBLANK=0 __GL_SARPEN_ENABLE=1 __GL_SHARPEN_VALUE=50 __GL_SHARPEN_IGNORE_FILP_GRAIN=17 ENABLE_GAMESCOPE_WSI=0 gamescope -r 155 -w 1970 -h 1108 -W 1440 -H 2560 -F nis --sharpness 10 -S fit -- env MESA_VK_WSI_PRESENT_MODE=mailbox gamemoderun /home/justcausewhynot/Games/WarThunder/linux64/aces
I did try it, but I don't know if it was better or worse in terms of response times & such. And sadly not much different visual flickering possible a little better, but hard to tell.
If you want to provide me more info on gamescope's behavior there's a couple things you can do:
run a program with mangoapp (I'm not sure how to do this with steam games).
note that running mangoapp actually seems to cause more stuttering
example: env __GL_SYNC_TO_VBLANK=0 __GL_SARPEN_ENABLE=1 __GL_SHARPEN_VALUE=50 __GL_SHARPEN_IGNORE_FILP_GRAIN=17 MESA_VK_WSI_PRESENT_MODE=mailbox ENABLE_GAMESCOPE_WSI=0 gamescope --immediate-flips -r 155 -w 1970 -h 1108 -W 1440 -H 2560 -F nis --sharpness 10 -S fit -- gamemoderun sh -c 'mangoapp & vkcube'
launch steam thru the terminal so that you can view the stdout of gamescope
EDIT: or, if you're not using steam, just checking the output of gamescope in your terminal
I'll give that a try, and report back.
Anyhow, thanks for trying out my work!
You're the one I should be thanking. If you can get gamescope to work really well for me I'd be very grateful. I really appreciate the support and work you've provided. Thank you!
EDIT: oh, one more question: have you tried running bbr w/ gamescope, and if so, does the behavior feel any different compared to running war thunder w/ gamescope
I've kinda stopped playing it, or trying to use gamescope to test it. I can test it out for you though.
EDIT EDIT: also, I recommend turning off the in-game vsync setting for any game you run w/ gamescope
I've had vsync off this entire time, but in bbr I can disable the in game frame limiter as well.
EDIT EDIT EDIT: another command to try: Exec=env MESA_VK_WSI_PRESENT_MODE=mailbox ENABLE_GAMESCOPE_WSI=0 gamescope -r 155 -w 1970 -h 1108 -W 1440 -H 2560 -F nis --sharpness 10 -S fit -- gamemoderun /home/justcausewhynot/Games/WarThunder/linux64/aces
This is definitely the best experience yet. It's much better, but some new frames cause not a flicker, but a weird refresh when pushing the new frame. I should also add that at low frame rates the game is super janky. Trying to, say, just pan the camera around causes it to jerk around a lot.
Also my apologies for leaving you hanging for a few days, and Happy Thanksgiving!
Happy thanksgiving to you as well.
I've done some more tweaks to the code, and I've also merged in newer upstream commits... new branch: nvidia-fix-and-vblank-debug-extra-experimental-v4.5. Though I'm not sure if you'll notice all too much of a difference.
This is definitely the best experience yet. It's much better, but some new frames cause not a flicker, but a weird refresh when pushing the new frame. I should also add that at low frame rates the game is super janky. Trying to, say, just pan the camera around causes it to jerk around a lot.
Yeah, I think I have some idea of the part of gamescope's code that is still causing an issue... I think that the pipe2() thingy, which currently moves the vblank signals generated by the vblankmanager thread to the steamcompmgr thread, can be prone to periodic latency and jitter.
I'm thinking about replacing the pipe2() thing with an atomic ringbuffer, which hopefully might have lower overhead.
in the meantime, I added some code to nvidia-fix-and-vblank-debug-extra-experimental-v4.5 which prints out the latency of the pipe2() thing (example output: dispatch_vblank(int): VBlankTimeInfo_t receive latency: 0.07ms).
If you end up building and running the nvidia-fix-and-vblank-debug-extra-experimental-v4.5, I would love to see what the terminal output shows for you...
Only just saw this, interesting. I'll have to try it out when I'm back.
I've been tempted to remove the vblank thread and just replace it with timerfd and updating it every vblank in the main thread.
Its one of the things I want to try when I'm back.
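A rough sketch of what that timerfd idea could look like (illustrative only, not an actual patch; the 120Hz period here is a placeholder, gamescope would derive it from the display's real refresh rate):

```cpp
#include <sys/timerfd.h>
#include <poll.h>
#include <unistd.h>
#include <cstdint>
#include <cstdio>

int main()
{
    // Periodic timer on the monotonic clock, polled from the main loop
    // instead of a dedicated vblank thread.
    int tfd = timerfd_create( CLOCK_MONOTONIC, TFD_CLOEXEC );
    if ( tfd < 0 )
        return 1;

    itimerspec spec = {};
    spec.it_interval.tv_nsec = 8333333; // ~120Hz placeholder period
    spec.it_value.tv_nsec    = 8333333;
    timerfd_settime( tfd, 0, &spec, nullptr );

    pollfd pfd = { tfd, POLLIN, 0 };
    for ( int i = 0; i < 10; i++ )
    {
        poll( &pfd, 1, -1 );
        uint64_t expirations = 0;
        read( tfd, &expirations, sizeof( expirations ) ); // acknowledge the tick
        printf( "vblank tick (%llu expirations)\n", (unsigned long long)expirations );
    }
    close( tfd );
    return 0;
}
```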
Only just saw this, interesting. I'll have to try it out when I'm back.
I've been tempted to remove the vblank thread and just replace it with timerfd and updating it every vblank in the main thread.
Its one of the things I want to try when I'm back.
the one advantage I've found with the vblank thread is that it can do some framepacing calculations while waiting for the next vblank. In the code I've written, I measure the rate of change in the drawTime (delta-drawTime), and then limit the change in rollingMaxDrawTime based on the change in delta-drawTime (delta^2-drawTime). This seems to smooth out the frametiming (sawtooth pattern -> curve pattern)
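Very loosely, as I understand the description above (the names and the exact clamping rule here are my own stand-ins, not the branch's actual code), the smoothing amounts to bounding how far rollingMaxDrawTime may move per frame by the second difference of the measured draw time:

```cpp
#include <algorithm>
#include <cstdint>
#include <cstdlib>

struct DrawTimeSmoother
{
    int64_t prevDrawTime = 0;
    int64_t prevDelta = 0;
    int64_t rollingMaxDrawTime = 0;

    int64_t update( int64_t drawTime )
    {
        int64_t delta  = drawTime - prevDrawTime; // delta-drawTime
        int64_t delta2 = delta - prevDelta;       // delta^2-drawTime
        prevDrawTime = drawTime;
        prevDelta = delta;

        // Let rollingMaxDrawTime move by at most |delta2| per frame, so a
        // single spiky frame can't yank the pacing target around.
        int64_t maxStep = std::abs( delta2 );
        int64_t target  = std::max( drawTime, rollingMaxDrawTime );
        rollingMaxDrawTime += std::clamp( target - rollingMaxDrawTime, -maxStep, maxStep );
        return rollingMaxDrawTime;
    }
};
```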
As for what I was mentioning about using a ringbuffer, here's a header-only c++ lockless atomic ringbuffer library I found: https://github.com/rigtorp/SPSCQueue
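For example, a single-producer/single-consumer handoff with that library could look roughly like this (VBlankTimeInfo_t here is just a stand-in struct, not gamescope's real type, and the 8ms period is a placeholder):

```cpp
#include <rigtorp/SPSCQueue.h>
#include <chrono>
#include <cstdint>
#include <thread>

struct VBlankTimeInfo_t
{
    std::chrono::steady_clock::time_point target_vblank_time;
    uint64_t sequence; // stand-in payload
};

int main()
{
    // Bounded, lock-free, single producer / single consumer.
    rigtorp::SPSCQueue<VBlankTimeInfo_t> queue( 8 );

    // Producer: stands in for the vblank thread.
    std::thread vblankThread( [&] {
        for ( uint64_t i = 0; i < 5; i++ )
        {
            queue.push( VBlankTimeInfo_t{ std::chrono::steady_clock::now(), i } );
            std::this_thread::sleep_for( std::chrono::milliseconds( 8 ) );
        }
    } );

    // Consumer: stands in for steamcompmgr picking up the vblank signal.
    for ( int received = 0; received < 5; )
    {
        if ( VBlankTimeInfo_t *info = queue.front() ) // nullptr when empty
        {
            // ...handle the vblank here, e.g. compare now against target_vblank_time...
            queue.pop();
            received++;
        }
    }
    vblankThread.join();
    return 0;
}
```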
If you end up building and running the nvidia-fix-and-vblank-debug-extra-experimental-v4.5, I would love to see what the terminal output shows for you...
I'll 100% share that. Today I'll test it out.
If you end up building and running the nvidia-fix-and-vblank-debug-extra-experimental-v4.5, I would love to see what the terminal output shows for you...
I'll 100% share that. Today I'll test it out.
Oh, one thing I've recently noticed is that when I played a game with my branch of gamescope, the game stuttered a lot less when I set gamescope's nested refresh rate to double my display's refresh rate.
For my 59.95Hz display, that meant adding -r 120 -o 120
For your display, double your refresh rate would mean using -r 310 -o 310
Let me know if you also see any difference with that
(sawtooth pattern -> curve pattern)
You need sawtooth to meet the deadline though.
I can't build 4.5. https://gist.github.com/JustCauseWhyNot/a9b03f6d10d539e0147ece775b83a429
I can't build 4.5. https://gist.github.com/JustCauseWhyNot/a9b03f6d10d539e0147ece775b83a429
./gamescope-9999/layer/VkLayer_FROG_gamescope_wsi.cpp: In static member function ‘static void GamescopeWSILayer::VkInstanceOverrides::GetPhysicalDeviceFeatures2(const vkroots::VkInstanceDispatch, VkPhysicalDevice, VkPhysicalDeviceFeatures2)’: ../gamescope-9999/layer/VkLayer_FROG_gamescope_wsi.cpp:504:54: error: ‘FindInChainMutable’ is not a member of ‘vkroots’ 504 | auto pSwapchainMaintenance1Features = vkroots::FindInChainMutable<VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_SWAPCHAIN_MAINTENANCE_1_FEATURES_EXT, VkPhysicalDeviceSwapchainMaintenance1FeaturesEXT>(pFeatures);
huh uhhh Idk why that is happening...
(sawtooth pattern -> curve pattern)
You need sawtooth to meet the deadline though.
ok well it still gives a sawtooth pattern (because of vblank_next_target not being restricted in how much it can change), just like a 'bounded' sawtooth pattern
this is what mangoapp w/ glxgears looks like (this is specifically with my latest nvidia-fix-and-vblank-debug-untested-newer-commits-from-upstream
branch)
EDIT: uhhh now that I've run it again it looks different lol (though I think this is still kinda a micro-sawtooth pattern if you look closely enough)
@Joshua-Ashton EDIT EDIT: I actually decided to update my 4.6 branch to use two separate rollingMaxDrawTime variables:
I'm launching bbr with
gamescope -w 1970 -h 1108 -W 1440 -H 2560 -F nis --sharpness 8 -S fit --immediate-flips -- strangle 139 gamemoderun %command%
It's doing a good job of upscaling, but despite setting immediate-flips it feels like I'm playing at 60fps. As shown in the screenshot, the frametiming is near perfect, and stays that way while I input into the game. But it still feels bad to use. I don't know if it's just a limitation of the compositor, or if there are options I'm missing. I am using dwm/x11, and no compositor of my own. And lastly, I'm using a gtx 1660 super. I'm not sure what else to write down, but I'd like to have the upscaling while not having to use a compositor. But the pr to merge into proton got closed :disappointed:.