ruanformigoni opened 1 year ago
Hi, you seem to understand this better than me - I've tried to downscale from 4K to my native resolution, however it seems like currently it doesn't supersample at all - the pixelation remains the same if I run it at 1080p or at 2160p.
Your screenshots indicate the same.
So would it really need to use RSR, or would simple supersampling be sufficient? I remember back in 2016 on Windows I used to use AMD Virtual Super Resolution; it made the image super smooth, and back then RSR definitely didn't exist.
Hi @rKsanu2MMYvypWePtQWM, I don't have extensive knowledge on this, I'm just enthusiastic :). Something like AMD Virtual Super Resolution would be great.
Perhaps the whole issue could be fixed by making sure the image actually gets filtered properly on the downscale? Currently it doesn't seem to do that very well...
I've done similar testing and came to the same conclusion: Gamescope simply does not have downscaling/supersampling support yet.
RetroArch is a perfect example of what kind of downscaling options should be available for Gamescope, just due to the nature of the games being played over there. Essentially, RetroArch users want to run an era-appropriate resolution (320x240) with integer scaling but also render internally at 2x, 3x, 4x, 8x, etc. This is not possible with Xrandr or Gamescope but absolutely could be added in.
An example in Gamescope: Halo: The Master Chief Collection. I'll load up Halo 2 Anniversary (classic). I want the render resolution set to 1920x1440 (3x) and then that should be scaled back down to my target of 640x480. There's no way to properly do this as of right now.
In Windows, AMD's Virtual Super Resolution (VSR) can do downscaling/supersampling, but only to your monitor's native resolution, e.g., 3200x1800 (VSR) downscales to 1920x1080 (native). I don't want to downscale only to native, and Windows now hard-caps the desktop resolution to 800x600 at the lowest.
Here's a video that shows exactly what this looks like and will give you an idea of why this feature is desirable:
Hi, I've implemented a bicubic downscaling algorithm (from this source) in PR #740. To test it:
```
git clone https://github.com/ruanformigoni/gamescope.git
cd gamescope
git checkout issue-692-bicubic
```
Then follow the instructions in the building section.
You can use the downscaler with the `-D` flag:
```
gamescope -D -W 2560 -H 1080 -w 5120 -h 2160 game-executable
```
I've tested your downscaling; it works pretty well.
A couple of example images, taken on my 1920x1440 display downscaling from 3840x2880 with integer overscale on in RetroArch.
Original (outside gamescope, no scaling):
Your bicubic:
Gamescope's original nearest-neighbor downscaling:
Pixel interpolation on the downscale indeed does reduce aliasing. There might be even better quality (zoom in on Mario, you can see some uneven edges on bicubic) with Lanczos/sinc. I think integer downscaling is probably less bad for 3D graphics the higher the starting resolution on your display is. For 2D graphics it's fine (just so long as you don't use an integer-scaling scanline shader, that won't work).
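For reference, the Lanczos kernel mentioned above is just a windowed sinc; a minimal GLSL weight function might look like this (illustrative only, not gamescope code; `a` is the lobe count, typically 2 or 3):
```glsl
// Lanczos weight: sinc(x) * sinc(x/a), windowed to |x| < a.
float lanczos_weight(float x, float a)
{
    const float PI = 3.14159265358979;
    x = abs(x);
    if (x < 1e-5) return 1.0; // limit of sinc at 0
    if (x >= a)   return 0.0; // outside the window
    float px = PI * x;
    return a * sin(px) * sin(px / a) / (px * px);
}
```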
I tried it out and it works pretty well!
I found a scaler that may be more appropriate than bicubic: Sharp-Bilinear. There's a link to its GitHub in here, and it appears to have integer scaling built in as well.
Sharp-Bilinear is for upscaling. It works by scaling to the nearest integer, then finishing up with bilinear interpolation. It won't help here, as the default for gamescope is already to do nearest-neighbor downscaling, which has almost identical effects to using sharp-bilinear (fractionally scaled output without pixel wobble, scaling artifacts in general, perfectly sharp). What the OP wants is a downscaling method that is softer than the current one (to hide aliasing further), not sharper than it. It's already as sharp as it can possibly get in the default -i mode (which is on without the -i flag unless your default config is using -Y or -U). All this said, having Sharp-Bilinear added to gamescope would be pretty awesome.
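For the curious, the core of sharp-bilinear is just a texture-coordinate remap before a regular bilinear fetch. A rough GLSL sketch of the idea, simplified from the libretro shader (`source_size` and `prescale`, the integer scale factor, are assumed inputs, not gamescope's actual names):
```glsl
// Remap UVs so sampling stays nearest-neighbor inside each integer-prescaled
// texel and only the remainder is bilinearly blended at texel edges.
vec2 sharp_bilinear_uv(vec2 uv, vec2 source_size, float prescale)
{
    vec2 texel = uv * source_size;             // position in source texels
    vec2 texel_floored = floor(texel);
    vec2 s = fract(texel);
    float region_range = 0.5 - 0.5 / prescale; // half-width of the "sharp" core

    // Flat (nearest) in the core of the texel, linear ramp only near the
    // edges so the hardware bilinear filter does the rest.
    vec2 center_dist = s - 0.5;
    vec2 f = (center_dist - clamp(center_dist, -region_range, region_range))
             * prescale + 0.5;

    return (texel_floored + f) / source_size;  // feed to a bilinear sampler
}
```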
Thanks for correcting me. I don't know a whole lot about this stuff yet.
I think all that's needed here is an optional smoothing filter (0-100 scale, with 100 being smoothest/blurriest) like the one included with Nvidia's DSR and DLDSR on Windows.
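Shader-side, such a knob could amount to something as simple as this (purely illustrative; Nvidia's actual filter isn't public, and `sharp`/`blurred` would come from two filter passes):
```glsl
// 0 = fully sharp result, 100 = fully smoothed result.
vec4 apply_smoothness(vec4 sharp, vec4 blurred, float smoothness)
{
    return mix(sharp, blurred, clamp(smoothness, 0.0, 100.0) / 100.0);
}
```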
There's an issue with downscaling while using fullscreen with a target resolution that's below the monitor's native resolution.
Here's my launch options: `gamemoderun gamescope -D -f -W 640 -H 480 -w 2560 -h 1920 -- %command%`
When not fullscreen, this seems to be working correctly. However, when you go into fullscreen, it seems to scale 2560x1920 to 1920x1080 (native) instead of my target of 640x480.
I also tried the -D and -i launch options together, and it does not do downscaling combined with integer scaling.
Here's some images: https://imgsli.com/MTUyODA0/0/1 https://imgsli.com/MTUyODQ5
There is a lot of what looks like texture aliasing, or bright/dark pixels that don't seem to be an average of the samples from the higher-res image. We may get a better result with box sampling or something other than bicubic (see the sketch after the links below). You can go through the extra trouble of rendering all these samples, but some of them are getting ignored, because in some cases there are now too many samples for bicubic sampling to use.
https://en.wikipedia.org/wiki/Image_scaling#Box_sampling
https://community.topazlabs.com/t/downscale-using-perceptually-based-downscaling-of-images/28675
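To make the box-sampling suggestion concrete: for each output pixel you average every source pixel it covers, so no sample gets ignored. A minimal compute-shader sketch, assuming an integer scale factor between source and destination (the bindings and `factor` push constant are hypothetical, not gamescope's actual interface):
```glsl
#version 450
layout(local_size_x = 8, local_size_y = 8) in;
layout(binding = 0) uniform sampler2D src;
layout(binding = 1, rgba8) writeonly uniform image2D dst;
layout(push_constant) uniform Push { ivec2 factor; } pc;

void main()
{
    ivec2 out_px = ivec2(gl_GlobalInvocationID.xy);
    vec4 sum = vec4(0.0);
    // Average every source texel that falls inside this output pixel,
    // so no high-resolution sample is discarded.
    for (int y = 0; y < pc.factor.y; y++)
        for (int x = 0; x < pc.factor.x; x++)
            sum += texelFetch(src, out_px * pc.factor + ivec2(x, y), 0);
    imageStore(dst, out_px, sum / float(pc.factor.x * pc.factor.y));
}
```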
Doesn't happen to me. Make sure your monitor is set to 640x480 from display settings before running that.
Want to issue a retraction of the following: "which has almost identical effects to using sharp-bilinear (fractionally scaled output without pixel wobble, scaling artifacts in general, perfectly sharp)." It still has pixel wobble, unlike sharp-bilinear. I just wasn't noticing it.
I've got a comment of value to add: the best downscale quality is achieved with B-Spline, which is a Mitchell-Netravali filter (parameters B=1, C=0). It only adds blur, no aliasing/blocking/ringing. The only problem is that it only really works well for downscaling by half or less. If you downscale by a quarter, for example, it would be too blurry. If Gamescope could allow 2 scaling passes, then we could pre-downscale to double our display resolution with integer nearest-neighbor downscaling (which only adds aliasing, no blur/blocking/ringing), then downscale by 50% from there with B-Spline for the maximum quality.
An example of how the command could look after such an implementation, for downscaling 8K to 1080p: gamescope w1 7680 h1 4320 -n (telling gamescope to scale by this filter to the next nested resolution) -i w2 3840 h2 2160 -D -i -W 1920 -H 1080
This would create an initial nested resolution of 7680x4320 for the game to see, then downscale that by integer nearest neighbor to another nested resolution on top of that (the game doesn't see this, it still only sees 7680x4320) of 3840x2160, then finally downscale using integer B-Spline to our actual display resolution of 1920x1080. This should result in one of the highest possible quality downscale results.
I think this also provides an answer to what specfreq has asked about. It would allow downscaling to a resolution lower than the output resolution, and then upscaling to the output resolution.
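For reference, B-Spline, Catmull-Rom, Hermite, etc. are all instances of the standard two-parameter Mitchell-Netravali kernel (see the Wikipedia link further down the thread):

$$
k(x) = \frac{1}{6}
\begin{cases}
(12-9B-6C)\,|x|^3 + (-18+12B+6C)\,|x|^2 + (6-2B) & \text{if } |x| < 1 \\
(-B-6C)\,|x|^3 + (6B+30C)\,|x|^2 + (-12B-48C)\,|x| + (8B+24C) & \text{if } 1 \le |x| < 2 \\
0 & \text{otherwise}
\end{cases}
$$

With B=1, C=0 this reduces to the cubic B-Spline, whose weights are non-negative everywhere, which is why it can only blur and never ring.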
Hey all, thanks a lot for the feedback! I released an improved version of the PR.
The ranges are shown below; the numbers represent the resolution multiplier between the render resolution and the monitor resolution, e.g., render at 2160p and display at 1080p == 2160/1080 == 2.
* Range of [1,2) uses FSR bilinear
* Range of [2,3) uses Catmull-Rom with a mean of 4 samples
* Range of [3,4) uses Catmull-Rom with a mean of 9 samples
* Range of [4,inf) uses Catmull-Rom with a mean of 16 samples
You can toggle the downscaling effect with Super + K.
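In pseudocode, the range mapping above amounts to something like this (illustrative names only, not the PR's actual code; `ratio` would be render size over display size):
```glsl
// Pick the sampling footprint per side from the downscale ratio.
// 0 means "fall back to the FSR bilinear path" in this sketch.
int taps_per_side(float ratio)
{
    if (ratio < 2.0) return 0; // [1,2): FSR bilinear
    if (ratio < 3.0) return 2; // [2,3): Catmull-Rom over a 2x2 mean (4 samples)
    if (ratio < 4.0) return 3; // [3,4): Catmull-Rom over a 3x3 mean (9 samples)
    return 4;                  // [4,inf): Catmull-Rom over a 4x4 mean (16 samples)
}
```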
Is there a way to make the xrandr command `xrandr --output "HDMI-0" --mode 1920x1080 --panning 3840x2160 --scale 2x2` apply as a launch option for games? I'm not sure if it can auto-reset after closing down, but Lutris has an option for that. When I put in the command, the game won't boot. It works just fine as a script, however.
@LethalManBoob This would change the resolution for everything (not only for your game). You can run the game in 4K with gamescope: `gamescope -w 3840 -h 2160 -W 1920 -H 1080 -- /path/to/game/exe`. This will make only the game run in 4K and not mess up your desktop resolution.
I am okay with it changing the resolution of my whole desktop, since I can manually change it back by clicking a script file I made to auto-change the res to 4K, and another to go back to native 1080p.
Yes, it's cumbersome, but for an Nvidia user it's just how things are. Gamescope is out of the question for now, considering the frame stutter issue on Wayland and the lack of fullscreen gamescope support.
Gamescope supports fullscreen just fine... Win + F or --fullscreen.
I am not aware of any frame stutter issue
Oh I see, something NV specific, my mistake.
For me Gamescope often works in fullscreen, but sometimes freezes (I have to kill the gamescope-wl process). I have a 3080. Xrandr already has better SSAA/downscaling than gamescope upstream by default, probably bilinear downscaling. ruanformigoni's pull request would make it higher quality than xrandr's.
AFAIK there is a way to scale the image to the nearest integer of the base resolution and use bilinear filtering to finalize the smoothness without too much blur. I think it's really only good for upscaling or 4K downsampling.
Hi,
I have updated the PR for this issue:
I have some questions about this @Joshua-Ashton. Would you like to have optional sub-options for the '-D' flag to adjust these settings? E.g.:
```
gamescope -D my-cmd        // Default B=.2 & C=.3
gamescope -D1.0,0.0 my-cmd // B=1.0 & C=.0
```
Another option would be to use environment variables, e.g., `GAMESCOPE_BICUBIC_B=.3` & `GAMESCOPE_BICUBIC_C=.2`.
About the current implementation: raw bilinear filters deliver subpar quality compared to the bilinear result of FSR, and bicubic is too strong for under-2x downscaling, so I've set FSR to work as a bilinear filter when downscaling under 2x the resolution.
Would you know how to print debug information from the `.comp` shader files at runtime? I had tried this, but could not get any output in the terminal.
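One approach that may work for runtime prints from a `.comp` shader is the `GL_EXT_debug_printf` extension. Its output goes through the `VK_LAYER_KHRONOS_validation` layer (with its debug-printf feature enabled) rather than straight to the terminal, which could explain the missing output. A minimal sketch:
```glsl
#version 450
#extension GL_EXT_debug_printf : enable

layout(local_size_x = 8, local_size_y = 8) in;

void main()
{
    // Print from a single invocation only, to avoid flooding the log.
    if (gl_GlobalInvocationID.xy == uvec2(0))
        debugPrintfEXT("invocation 0 reached, workgroups = %u x %u\n",
                       gl_NumWorkGroups.x, gl_NumWorkGroups.y);
}
```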
I propose changing the flag from -D to -B (for bicubic, since the -B flag is not used; bilinear is the default and doesn't use a flag) to more accurately reflect the scope of what this is. -D implies that it's only for downscaling, when in reality the current implementation already works for both upscaling and downscaling, and (in my subjective opinion) yields better image quality for upscaling than the 4 currently available options (nearest neighbor -n, bilinear which is the default with no flag, NIS -Y, FSR -U). Particularly when we are talking B-Spline (B1 C0) and small up/downscales of 2x or less.
It is also adjustable to suit the user's needs, with a vast range of possible appearances through adjusting the B and C parameters. While I prefer B-Spline (B1 C0), most will probably prefer Mitchell (B0.33 C0.33), and some might prefer Hermite (B0 C0) or Catmull-Rom (B0 C0.5). The preferences may vary from game to game as well. People used to nearest neighbor on their pixel-art games will probably use Hermite to get a similar appearance to nearest neighbor without having to stick to integers (when integer scaling results in windowboxing).
On the subject of the parameters: `gamescope -B 1.0,0.0` is my vote.
The space should be there so it reads less confusingly and acts like the -w/-W and -h/-H arguments.
Also, leaving no arguments (just `gamescope -B`) should default to the Mitchell-Netravali recommendation of B0.33 C0.33, not B0.2 C0.3:
https://en.wikipedia.org/wiki/Mitchell%E2%80%93Netravali_filters
Hope this helps, and it's looking great. Can I use the parameters right now if I build from your repo? I'd love to test.
Hi @DisplayTalk , thanks for the feedback! I've finished the implementation of the configurable parameters. Example:
```
gamescope --bicubic=0.5,0.5 -W 2560 -H 1080 -w 5120 -h 2160 -- vkcube
# Or this
gamescope -D 0.5,0.5 -W 2560 -H 1080 -w 5120 -h 2160 -- vkcube
```
I've also set the default to .33 for both B & C; the short parameter is still '-D' for the moment. I've chosen this because the implementation is a mixture of bicubic and the FSR bilinear filter, to get the best results, since the bicubic algorithm seems too strong for downscaling under 2x.
If you want to experiment with only bicubic, you can comment out these lines, and rebuild:
Seems to work fine for downscaling, but for upscaling this filter/shader is broken for now. The following is with the FSR/bicubic hybrid functionality removed as you instructed (which seems like no loss when using B1 C0 for downscaling).
Momodora 3 scaling from 320x240 to my PC CRT's desktop resolution of 1240x930 with --bicubic=0.0,0.0 (Hermite, which is supposed to be very sharp, nothing like it looks here):
Default scaling of gamescope, which is bilinear filtering, everything else kept the same:
RetroArch's bicubic shader set to the same parameters on Cave Story (also a 320x240 native game), scaling to the same resolution:
To give you what I think is a clue as to what may be wrong: here is that same shot of Cave Story in RetroArch, but with the scale on the bicubic shader manually set incorrectly to 1x (normally it operates at the scale of the monitor resolution, not 1x scale of the content) and B1 C0. Not quite as blurry and pixelated as the result in this implementation, but closer to what it's looking like.
Edit: I'd also like to add that I'm not sure the configurable parameters are actually working when downscaling. It looks kinda ringy even on B1 C0. Idk, maybe it's just HL2 looking a bit ringy on textures.
Hi @DisplayTalk, I'm trying to port the code from RetroArch, to see how it looks in comparison to what is currently implemented. I've also implemented another method for the bicubic calculations shown here. You can try it out by replacing `cs_bicubic.comp` with this file: cs_bicubic.comp.bak17.zip. Thanks for the examples!
Are we any closer on this? I'd love to game at 4K more via gamescope.
Same! GTA 4 looks really good in screenshots in the pull request.
I miss AMD VSR from Windows. It has the best downscaling quality imo, as it uses the GPU's hardware scaler.
If this can come close to it in quality, it would be amazing!
I commented on the PR https://github.com/ValveSoftware/gamescope/pull/740 but I'll add my comment here: I'd love to have this for older games! Been playing the ME trilogy, for example, and it'd be really nice to push it to 1.5x or 2x scale and get some nice AA out of it.
Greetings,
Would it be possible to implement something like RSR or DLDSR to improve image quality on down-scaling? I've tested gamescope with a virtual resolution in xorg, and it has better quality than the downscaling done by gamescope:
```
xrandr --output "HDMI-A-0" --mode 2560x1080 --panning 5120x2160 --scale 2x2
```
Here is an example with Need for Speed: ProStreet. Take the image below as an example; this was running at 10240x4320 (both window and internal resolution). The aliasing seems to see little improvement over the 2560x1080 counterpart, in the power lines, light rails, and the tower to the right.
However, when zoomed in, one can see that the improvement was incredible, but the down-scaling process causes additional aliasing.
Edit:
I've done some research, and played around with the code. Here are some resources:
[1] https://registry.khronos.org/vulkan/specs/1.3-extensions/man/html/VkSamplerCreateInfo.html
[2] https://registry.khronos.org/vulkan/specs/1.3-extensions/man/html/VkFilter.html
[3] https://vulkan-tutorial.com/Texture_mapping/Image_view_and_sampler
[4] https://towardsdatascience.com/image-processing-image-scaling-algorithms-ae29aaa6b36c
[5] https://stackoverflow.com/questions/13501081/efficient-bicubic-filtering-code-in-glsl
Maybe it is possible to enable bilinear or bicubic interpolation with the `samplerInfo.minFilter` setting. Also, is there a way to do something like `xrandr --output "HDMI-A-0" --mode 2560x1080 --panning 5120x2160 --scale 2x2` in the nested X session? That would improve the final quality, as shown in the first comparison.
I got close to making it compile today with the changes in panning:
But there are linking issues with the xrandr header `#include <X11/extensions/Xrandr.h>`:
Compile output:
```
FAILED: gamescope
c++ -o gamescope gamescope.p/meson-generated_.._protocol_gamescope-xwayland-protocol.c.o gamescope.p/meson-generated_.._protocol_gamescope-pipewire-protocol.c.o gamescope.p/meson-generated_.._protocol_gamescope-input-method-protocol.c.o gamescope.p/meson-generated_.._protocol_gamescope-tearing-control-unstable-v1-protocol.c.o gamescope.p/src_steamcompmgr.cpp.o gamescope.p/src_main.cpp.o gamescope.p/src_wlserver.cpp.o gamescope.p/src_drm.cpp.o gamescope.p/src_modegen.cpp.o gamescope.p/src_sdlwindow.cpp.o gamescope.p/src_vblankmanager.cpp.o gamescope.p/src_rendervulkan.cpp.o gamescope.p/src_log.cpp.o gamescope.p/src_ime.cpp.o gamescope.p/src_mangoapp.cpp.o gamescope.p/src_pipewire.cpp.o -Wl,--as-needed -Wl,--no-undefined -Wl,--start-group subprojects/wlroots/libwlroots.a subprojects/libliftoff/libliftoff.a /usr/lib/libX11.so /usr/lib/libXdamage.so /usr/lib/libXfixes.so /usr/lib/libXcomposite.so /usr/lib/libXrender.so /usr/lib/libXext.so /usr/lib/libXxf86vm.so /usr/lib/libXRes.so /usr/lib/libdrm.so /usr/lib/libwayland-server.so /usr/lib/libxkbcommon.so -pthread /usr/lib/libSDL2.so /usr/lib/libudev.so /usr/lib/libpixman-1.so -lm -lrt /usr/lib/libinput.so /usr/lib/libwayland-client.so /usr/lib/libseat.so /usr/lib/libxcb.so /usr/lib/libxcb-composite.so /usr/lib/libxcb-icccm.so /usr/lib/libxcb-render.so /usr/lib/libxcb-res.so /usr/lib/libxcb-xfixes.so /usr/lib/libxcb-errors.so /usr/lib/libvulkan.so /usr/lib/libXtst.so /usr/lib/libcap.so /usr/lib/libpipewire-0.3.so -Wl,--end-group
/sbin/ld: gamescope.p/src_wlserver.cpp.o: in function `gamescope_xwayland_server_t::gamescope_xwayland_server_t(wl_display*)':
/home/ruan/Repositories/gamescope/build/../src/wlserver.cpp:766: undefined reference to `XRRGetScreenResourcesCurrent'
/sbin/ld: /home/ruan/Repositories/gamescope/build/../src/wlserver.cpp:768: undefined reference to `XRRGetPanning'
/sbin/ld: /home/ruan/Repositories/gamescope/build/../src/wlserver.cpp:771: undefined reference to `XRRSetPanning'
collect2: error: ld returned 1 exit status
ninja: build stopped: subcommand failed.
```
Managed to make it compile with `dep_xrandr = dependency('xrandr')`, however it segfaults.