Closed: vially closed this pull request 7 years ago
Overall looks good. Maybe add documentation in README as well.
Tried testing this on my machine with both Way Cooler and Sway, got the same error from wlc:
Failed to activate vt1 for restoration
Failed to switch back to vt1
I made sure I installed the drivers properly, because I can use GNOME + Xorg with this driver loaded. Using an NVIDIA GTX 1070 card, not that it probably matters.
EDIT: Same effect from the example compositor. More of the error message that might be relevant:
Failed to get drm resources
drmModeGetResources failed
@Timidger Do you have Nvidia's DRM KMS module enabled? It is unstable and disabled by default, but needed for EGL. Use modprobe -r nvidia-drm; modprobe nvidia-drm modeset=1 if you don't.
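In case it helps anyone else hitting this: rather than reloading the module by hand after every boot, the modeset flag can be made persistent with a modprobe.d drop-in. A sketch (the .conf filename is arbitrary, and the sysfs check only works once the module is loaded):

```shell
# Persist nvidia-drm KMS across reboots (any .conf name under modprobe.d works)
echo "options nvidia-drm modeset=1" | sudo tee /etc/modprobe.d/nvidia-drm-kms.conf

# After the next module load, confirm the kernel picked it up ("Y" when enabled)
cat /sys/module/nvidia_drm/parameters/modeset
```

If the driver is built into the initramfs, the initramfs may need to be regenerated for the option to take effect at early boot.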
@Cloudef I've addressed your comments and pushed the changes.
I noticed a very strange thing though. It seems I can still run the example using the GBM buffer API instead of EGL streams on the NVIDIA proprietary driver, but it has very poor performance. Any idea how this might be possible given that the proprietary NVIDIA driver never advertised that they support the GBM API?
@4e554c4c Thanks for the hint, I did have that disabled... but now it seems I have a screwy setup, since it keeps segfaulting, and after checking the core dump it seems it's still trying to use nouveau... oh well, it's about time I switched distros on this machine anyway :P. Thanks for all your help.
@vially I'm not sure. Maybe it falls back to some software implementation. Do you happen to have a GBM-compatible GPU in your PC as well?
@Cloudef Yes, I do have an Intel CPU with integrated graphics. But /dev/dri only lists one card and the monitor is connected to the NVIDIA HDMI port, so I'm not sure how that would work. Strange...
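For what it's worth, a quick way to see which kernel driver actually owns each DRM node is to follow the driver symlinks under sysfs. A small sketch (the sysfs path is parameterized only so it can be pointed at a test directory; in practice the default is what you want):

```shell
# Map each DRM card node to the kernel driver behind it, to check whether
# /dev/dri/card* is backed by nvidia, nouveau, or i915.
drm_drivers() {
  sysfs="${1:-/sys/class/drm}"
  for card in "$sysfs"/card*; do
    # Connector entries (card0-HDMI-A-1 etc.) have no device/driver link,
    # so this guard keeps only the actual card nodes.
    [ -e "$card/device/driver" ] || continue
    echo "$(basename "$card") -> $(basename "$(readlink -f "$card/device/driver")")"
  done
}

drm_drivers
```

If this prints only one card bound to i915 (or only one bound to nvidia), that would explain what /dev/dri is showing.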
@Timidger I've tried running the example on my main work PC, which is a multi-monitor setup driven by a GTX 970, and it seems to crash as well. I'll try to look more closely into it to see what might be the problem (although it seems to work just fine on my other PC, which has a single monitor and a GTX 750 card). I remember from some early test runs that all outputs were assigned the same CRTC id, but I'm not sure if that's still the case with the latest code. I'll have to try and see.
EDIT: There was indeed a bug in the code that caused all the outputs to be assigned the same CRTC. I've fixed it by using a more robust approach when selecting the output CRTC, and now the example code seems to run on my multi-monitor setup too.
Update: I've been able to run sway successfully without any additional changes.
Let's merge and see what happens.
@vially are you happy that you've now contributed to the segmentation of the entire Linux desktop? By merging this no problem has been solved, but you've worsened an existing one. Because Nvidia couldn't admit they weren't first and didn't want to use an existing, nice API created by Intel, they NIH'd this one, and now it falls onto projects to support both? This is absolute crap, and instead of yielding to Nvidia you should have sent them an angry email like this one to support GBM. Or better yet a PR, though don't count on it being merged because they'd rather NIH it. Now instead of only 1 compositor implementing this piece of crap we have 2, and that's a lot more than 0.
@atomnuker I'm sorry you feel this way. I don't like the current situation with two competing buffer APIs either. But I'm also trying to be pragmatic, and I thought it might be useful to have wlc working on proprietary NVIDIA drivers until the NVIDIA and Wayland developers decide on a single buffer API.
I also happen to own an nVidia card which is not supported by the nouveau driver so I really didn't have any other option if I wanted to use Wayland.
@vially Are you using glvnd-enabled mesa? If so, /usr/share/glvnd/egl_vendor.d/50_mesa.json overrides /usr/share/glvnd/egl_vendor.d/10_nvidia.json. Renaming 10_nvidia.json to 90_nvidia.json should correct it.
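As an aside, libglvnd can also be pointed at a specific vendor ICD for a single session without renaming the JSON files, via its environment variables (the path below assumes the stock Arch install location and may differ per distro):

```shell
# Force glvnd to load only the NVIDIA EGL ICD for this session,
# instead of renaming the vendor JSON files system-wide.
export __EGL_VENDOR_LIBRARY_FILENAMES=/usr/share/glvnd/egl_vendor.d/10_nvidia.json
```

That makes it easy to compare the mesa and NVIDIA EGL paths back to back before committing to the rename.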
@leigh123linux Thanks, I wasn't aware of that. I'll experiment with that to see if it makes any difference.
I have the same issue on 4.16.3-1-ARCH. The modeset=1 option is enabled in the DKMS config, however it works flawlessly on every SECOND boot. Something is not released on shutdown?
Add support for EGL streams used by the proprietary NVIDIA driver: #138
The current implementation selects the EGL stream buffer API (instead of the default GBM one) when the WLC_USE_EGLDEVICE environment variable is not empty (e.g. export WLC_USE_EGLDEVICE=1 is needed at runtime).
I've only tested it by running the example compositor on a single-monitor setup (Arch Linux using the latest NVIDIA drivers) and it seems to work.
Disclaimer: I don't know C, so the code most probably contains lots of bugs; please review very carefully (especially around memory allocations/deallocations).
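The "not empty" selection rule described above can be sketched as a small C helper. This is an illustrative sketch only, not the actual wlc source; the function name is made up for the example:

```c
#include <stdbool.h>
#include <stdlib.h>

/* Hypothetical sketch: pick the EGL streams backend whenever
 * WLC_USE_EGLDEVICE is set to any non-empty value, matching the
 * "not empty" behaviour described in the PR text above. */
static bool use_egldevice_backend(void)
{
    const char *value = getenv("WLC_USE_EGLDEVICE");
    return value != NULL && value[0] != '\0';
}
```

Under this rule, WLC_USE_EGLDEVICE=1, =yes, or any other non-empty string would all select the EGL device path, while an unset or empty variable keeps the GBM default.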