Closed ghost closed 2 years ago
This seems to be a common issue with NVIDIA graphics cards (I think there's at least one other report like this with NVIDIA). Sadly, I don't have a dedicated NVIDIA graphics card available to test this properly, and I can't seem to reproduce it on any of my AMD systems.
Compiling irrlicht manually should be really straightforward; it has already been needed for Fedora, and it could allow some extra testing if you feel like digging deeper into this: https://github.com/Almamu/linux-wallpaperengine/issues/16#issuecomment-629167606
Good to hear that other backgrounds look okay! Hopefully, with the data-separation branch, most of them will start working properly once I finish the rewrite (that is, if I have enough time).
With the recent changes to how the window is created and handled (and getting rid of irrlicht), this issue should (hopefully) be gone. I don't have any NVIDIA cards, so I cannot test it myself, though. Could you give it a try if you're still interested in this project?
I fetched the latest version today from GitHub, and when I try to run it without specifying a `--screen-root` I get a segmentation fault. When I try to let it render on the integrated display of my laptop, I get "X Error of failed request: BadMatch (invalid parameter attributes)".
Here's a screenshot of the terminal containing the output from `wallengine`:
I would also like to help with testing on NVIDIA mobile hardware, as my laptop runs a "GeForce RTX 2080 with Max-Q Design" in conjunction with some Intel onboard graphics (which, in my software setup, is only used to get the rendered video data to the built-in screen).
If you want me to try something, just let me know.
About the segfaults: I think I know where they're coming from. I'll update the README.md so it lists the proper arguments for running it and add checks to make sure it doesn't crash like that. About testing on NVIDIA, for now there's not really much I can say unless you know C++ and can try to debug it, or find an alternative way of rendering to the desktop.
Ok, thanks for the info.
This issue should be definitely fixed since commit a9db3ff3642229e3b597a17bc20500a5b18bd73b by @Hynak. Please feel free to re-open this issue if it keeps happening.
Nice, I got it working after also installing GLFW (which I had been missing since your OpenGL version was published). I just tested it with the wallpaper linked here: https://steamcommunity.com/sharedfiles/filedetails/?id=2171533743
That's wonderful! What DE/WM are you using there? Is that a normal MATE installation with customization on top, @Martin1995? (Just want to get some info so I know what environments people have tested the functionality with ;))
It's just a standard MATE desktop with Marco, running on Arch with the WinME/2k-style window theme, and no custom software regarding the DE and WM. The only custom things are the set wallpapers, and those don't need counting here 😊 The only thing I did regarding X is that it's set up to use the NVIDIA card only and pipe the screen output to the iGPU, which is just used to drive my laptop's internal screen (which I can't hardware-switch to the dGPU). I have not tested it with external displays attached to the NVIDIA card yet, only with the internal screen, which I mostly use.
I just wanted to let you know that I tried dual screen today, with the eDP (laptop screen) on Intel and HDMI (portable external monitor) on NVIDIA. It also works without problems.
Describe the bug
Running wallengine succeeds without `--screen-root`, but fails with it enabled.

To Reproduce
```
./wallengine --screen-root DP-0 --pkg $HOME/.steam/steam/steamapps/workshop/content/431960/2154699571
```
According to xwininfo, the normal wallengine window and the X root window differ by the GLX visual flags: `multiSample=8` and `multiSampleBuffers=1` versus `0` and `0`, respectively. SDL's choice of visual should be settable by the envvar `SDL_VIDEO_VISUAL_ID`, but in my testing it doesn't honor this flag. The Nvidia Control Panel's "override multisampling" setting also has no effect on any GLX visual flags. It makes no difference whether another desktop compositor is running or not.
Related: https://stackoverflow.com/questions/51400305/irrlicht-fails-to-create-a-glx-context-when-passed-a-sdl2-created-window
On a positive note, the `waterwaves.frag` effect from https://steamcommunity.com/sharedfiles/filedetails/?id=2154699571 currently works with linux-wallpaperengine.