Bumblebee-Project / Bumblebee

Bumblebee daemon and client rewritten in C
http://www.bumblebee-project.org/
GNU General Public License v3.0

Is this project dead? #947

Open jambonmcyeah opened 6 years ago

jambonmcyeah commented 6 years ago

There has been a year without commits; is this project dead?

axelzaro commented 6 years ago

Yep! seems like...

ArchangeGabriel commented 6 years ago

At this point, I think that we should say so. I don’t think Bumblebee still has any purpose given how much running directly on the NVIDIA card has progressed.

From my POV, there are three possibilities:

  1. You only care about turning off your card, in which case you can use bbswitch or nouveau directly.
  2. You have some outputs wired to your NVIDIA card that you need/want to use, then nouveau + PRIME is the way.
  3. You want to take advantage of your NVIDIA GPU for performance purposes, in which case Bumblebee is a very poor solution and you should just restart your X server on the NVIDIA card directly.

Nowadays, 1. and 2. can be achieved without any configuration, while 3. requires a bit of configuration, though on some distributions (Ubuntu at least I think) a helper exists to do it.
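
For reference, here is a rough sketch of what options 1 and 2 from the list above typically look like on the command line (the /proc/acpi/bbswitch interface assumes the bbswitch module is loaded, and the application name is just a placeholder):

```
# Option 1: power the discrete card off/on through bbswitch
cat /proc/acpi/bbswitch                     # check current state
echo OFF | sudo tee /proc/acpi/bbswitch     # turn the NVIDIA card off
echo ON  | sudo tee /proc/acpi/bbswitch     # turn it back on

# Option 2: render a single application on the NVIDIA card via nouveau + PRIME offloading
DRI_PRIME=1 glxinfo | grep "OpenGL renderer"   # should report the NVIDIA device
DRI_PRIME=1 some-opengl-app                    # placeholder application name
```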

Maybe there is one use case for Bumblebee, which is when the only thing you care about is CUDA/OpenCL and you don’t need any outputs wired to the NVIDIA GPU. But not sure if we actually support this use case (haven’t tried) and whether there are a lot of people requiring this.

jambonmcyeah commented 6 years ago

On my laptop, I experience a lot of lag when I start X on my nvidia card, so that's why I'm still using bumblebee.

ArchangeGabriel commented 6 years ago

What X.org and driver version? Do you have nvidia-drm loaded?

bluca commented 6 years ago

It's also not very convenient having to kill the session and lose all the state... I hope server-side glvnd in xorg 1.20 will help somewhat

jambonmcyeah commented 6 years ago

I have the latest versions from Arch and have nvidia-drm enabled. I also checked dmesg and that confirms it's enabled. I have a gtx 970M.

ArchangeGabriel commented 6 years ago

@bluca Yes, but if you’re after performance, Bumblebee won’t deliver that, so… I’m not sure how different the performance is between nouveau using PRIME and NVIDIA through Bumblebee.

Not sure what server-side glvnd will change, but in the end the best thing would be to get reclocking working with nouveau, which is something @KarolHerbst works on and has even been recruited by Red Hat for.

ArchangeGabriel commented 6 years ago

@jambonmcyeah Did you follow https://wiki.archlinux.org/index.php/NVIDIA#DRM_kernel_mode_setting?
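
(For anyone else reading this later: the relevant step from that page is enabling DRM kernel mode setting for the nvidia driver. Roughly, and depending on your bootloader/initramfs setup, that looks like this:)

```
# Either on the kernel command line:
nvidia-drm.modeset=1

# or as a module option, e.g. in /etc/modprobe.d/nvidia-drm.conf:
options nvidia-drm modeset=1

# Verify after a reboot; it should print Y:
cat /sys/module/nvidia_drm/parameters/modeset
```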

jambonmcyeah commented 6 years ago

@ArchangeGabriel Yeah, I followed the wiki. I also tried ubuntu and it also lags but it's not as bad. Is it a problem with gnome?

ArchangeGabriel commented 6 years ago

I don’t know. I haven’t tried the NVIDIA driver for ages, I basically only use nouveau for PM and for my HDMI output. I would have chosen a laptop without an NVIDIA card if I had the choice, but there are (or at least were) none in my spec range.

gsgatlin commented 6 years ago

Curious about something with respect to the performance of the nvidia driver on Optimus machines.

In the readme for the primus bridge that @amonakov wrote, it says:

"Q: Performance does not exceed 60 fps, I was getting more with optirun/VirtualGL. A: This is the effect of vblank synchronisation. For benchmarking, you can use vblank_mode=0 primusrun ..., but in practice this will probably only waste power, as your LCD panel does not display more than 60 frames per second anyway."

Does using other (non-bumblebee) solutions get around the 60 fps limit of the LCD panel? Or do they go above 60 fps nowadays?

Thanks a lot.

ArchangeGabriel commented 6 years ago

@gsgatlin Not sure whether I understand your question. By default, you get as much FPS as possible, but limited to vsync, which usually means 60 fps (unless you’re using some special laptop panel, or an external screen connected the right way, that runs at 120 Hz or more). If your question is whether solutions other than Bumblebee respect vsync, then I don’t know.

gsgatlin commented 6 years ago

@ArchangeGabriel Ok. Thanks. I was just wondering if 60fps was like a fundamental limit with the LCD display. Like the nvidia card draws faster than 60fps, but you can only see 60fps slices of that due to the LCD panel hardware limitation. I have used nvidia-prime on ubuntu before. When I try it now with "prime-select", glxgears -info says I am getting 7454.916 fps on a GF108GLM nvidia card. (DELL Latitude E6530 laptop)

But I'm not sure I notice a difference when using bumblebee instead of "prime-select". I guess I would need to try some games instead of glxgears.

I will say not having to log out and log in again is a bonus. But if the project is dead, it's dead. :) Maybe nvidia or the xorg guys will fix that soon so you can just use prime and not have to log out to switch.

ArchangeGabriel commented 6 years ago

60 fps is an LCD hardware limitation, yes. Your card may very well render 1 Mfps; your display won’t show more than 60.

glxgears is indeed not a benchmark; you’d be better off trying some Unigine demo, for instance.
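
For example, assuming a downloaded Unigine Heaven build (the ./heaven path below is just a placeholder for wherever the benchmark lives), a rough comparison would be:

```
# Intel/iGPU run, with vsync disabled so the result isn't capped at the panel refresh rate
vblank_mode=0 ./heaven

# Same benchmark offloaded to the NVIDIA card through Bumblebee/primus
vblank_mode=0 primusrun ./heaven
```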

And yes, in the end it should be NVIDIA’s responsibility to fix this situation. It might happen some day, though I wouldn’t bet on which of this vs. proper reclocking in nouveau will land first. ;) Might be GPU-dependent though, since newer ones are very badly locked from what I understood at FOSDEM earlier this year.

zfpsupport commented 6 years ago

@ArchangeGabriel Thanks for the info on Bumblebee status on this thread.

I did quite extensive tests on the best way to deal with the nvidia card, power consumption, Optimus, CUDA needs, and gaming on Linux, on an Alienware 13 R3 laptop.

Bumblebee is still the best option for me: it lets my laptop use as little battery as possible when the discrete card is not needed, use the discrete card for a specific game when I want max performance, and use the discrete card with CUDA-enabled apps without the need to restart the X session.

Not sure if this counts as a use case! But I'm also not sure if there is a better solution for my needs than using bumblebee...

Thanks.

gsgatlin commented 6 years ago

Yeah. Due to the 60fps limit of most LCD laptop screens, I'll probably keep using bumblebee for myself on fedora/centos until bbswitch/bumblebee/primus won't work anymore, or until nvidia fixes their out-of-tree driver to work more like how it works on Windows. :) But I sure do appreciate all the work you guys did to make this project a reality.

It's not reasonable to expect someone to keep coding something that they have no reason to use anymore. :) Although someone can always fork if they have some itch to scratch. nouveau works for a lot of users nowadays for power-saving purposes and driving an external monitor. A big improvement from how it used to be a few years ago, for sure. Maybe someday it will be able to compete with the closed-source driver, which would be awesome. :)

Thulinma commented 6 years ago

The reason there have been no commits is simply that no commits have been needed. I've personally seen Bumblebee as "complete" for a long time now. It works, and should continue to work for the foreseeable future. I continue to use Bumblebee every day myself (and have used it on all my laptops since originally working on it), and have never come across anything I felt needed changing/updating - otherwise I would have.

Bumblebee was always meant to be a temporary solution until a proper/official solution presented itself. I am both happy and sad that it continues to be useful to this day. (Happy because, well, it still serves a purpose; sad because a proper solution should really have been available by now.) It serves its purpose, however, and I don't feel like there is a need to improve it beyond its current state.

Of course, other devs may feel differently about this, but I have a feeling these thoughts are probably shared between all/most of them.

DRosky commented 6 years ago

While bumblebee may not be the highest performance solution, it is arguably the most convenient, which can matter sometimes.

There are two use cases that make bumblebee the most attractive option for me. The first is doing intermittent 3D modeling alongside other work, often on battery power. This is a different use case from games in terms of performance: raw FPS isn't as important as how fast the GPU can calculate a complex scene, and the Nvidia GPU is significantly faster than the Intel one. Since the modeling is intermittent, a solution that requires the Nvidia GPU to be on constantly (e.g., PRIME) would involve a significant hit in battery run time, though it would be fine when plugged in.

The second is running CUDA applications. I've read that this can be done without bumblebee and that bumblebee may be overkill for it, though it is convenient.

I admit these are probably corner use cases, and don't by themselves justify the continuation of the project beyond where it is right now.

DRosky commented 6 years ago

As an addendum to the last comment, ultimately, the whole Optimus approach is about improving battery life and balancing performance with battery life. So far, Nvidia has chosen not to support this goal on Linux, and bumblebee comes about as close as is possible without direct support. In my own case, it has made it possible to continue using Linux for what I do on my laptop, and I'm grateful for its existence, at least until there is proper support from Nvidia.

ghost commented 6 years ago

I would help support it, I've got some free time on the weekends. I just would need a walkthrough from the current maintainers on how to help out.

Kind regards,
Kyle Bogdan

jambonmcyeah commented 6 years ago

Well, I think the most important thing for bumblebee now is Vulkan support.

Lekensteyn commented 6 years ago

Unless someone contributes it, I doubt there will be Vulkan support.

Queuecumber commented 6 years ago

So here's my question then.

I have an nvidia GPU attached to my laptop via thunderbolt. I want to use that GPU to play a game. Concretely, what are my options assuming that bumblebee is dead?

Thulinma commented 6 years ago

@Queuecumber Assuming you have the card up and running at all (e.g. loaded drivers correctly etc), Bumblebee will let you use it. Just not for Vulkan-based games (only OpenGL). The project is not "dead", it's "done". You'll need to tweak the config files since that kind of setup won't work out of the box, but it shouldn't be too hard to get it working.
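
Roughly, that tweak usually amounts to pointing Bumblebee's secondary Xorg config (/etc/bumblebee/xorg.conf.nvidia) at the external card's PCI bus ID as reported by lspci; a sketch, with a made-up bus ID and the stock identifier name:

```
Section "Device"
    Identifier  "DiscreteNvidia"
    Driver      "nvidia"
    # Placeholder bus ID: use the one lspci reports for the Thunderbolt-attached card
    BusID       "PCI:0a:00:0"
EndSection
```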

DRosky commented 6 years ago

I would say it's provisionally "done". It was arguably done three years ago, but with games ultimately moving to Vulkan and bbswitch not supporting recent kernels' default power management and thus requiring pcie_port_pm=off, it is gradually becoming less done. (I'm considering bbswitch as being part of the overall bumblebee system.)
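
(For reference, that workaround is just a kernel command-line parameter, e.g. appended to the default line in /etc/default/grub before regenerating the grub config with your distribution's tool:)

```
# /etc/default/grub
GRUB_CMDLINE_LINUX="... pcie_port_pm=off"
```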

Right now there are still workarounds for many of the power management issues and most games are still OpenGL, and while bumblebee no longer works "out of the box" in most cases, it still can be made to work most of the time. But if a big shift happens, like pcie_port_pm becomes deprecated or not possible to turn off due to other system needs, or when games finally move to Vulkan in larger numbers, or any number of other unforeseen problems, then the bumblebee project as it exists will become unusable for many use cases. At that point, if it doesn't adapt, it will become truly "dead".

The devs have pretty clearly stated that they view that bumblebee is not the future of using the proprietary Nvidia driver on Optimus laptops, and that its usefulness has more or less come to an end due to the emergence of other options, so this is an accepted outcome due to those other options.

IMHO, while other options like Nvidia PRIME work, and are higher performance, they are not actually in the spirit of Optimus, which was to maximize battery life by turning the GPU on and off at will. Bumblebee is still the only option that is in that spirit, even if not as much so as is possible in Windows. For my own use cases, I will miss that when it finally becomes unusable for me.

JeffLabonte commented 6 years ago

Well, it works on Fedora 28. But still, is the project ever going to come back to life? It is the best option that we have so far, at least on Nvidia!

enricotagliavini commented 6 years ago

Sorry to resurrect this after quite a long time. I just want to point out one motivation for Bumblebee that so far I didn't see mentioned. My primary motivation for using bumblebee (which is also why I don't buy a desktop but stay on laptops) is that I can use the Intel driver to run the desktop. I hear so many horror stories on IRC and the like when using the Nvidia driver to run the desktop. Not to mention that plain xrandr seems to still be kind of awkward. Ultimately I also don't want to hear the answer "sorry, we don't support the Nvidia driver, use nouveau", because a) nouveau cannot even boot my computer correctly and b) it's a huge effort to get it working, if even possible now that you need signed firmware, which is basically vaporware.

With the Nvidia solution I would have to use the Nvidia card all the time, since they don't support offloading on Linux, and I'm seriously concerned my desktop experience would suffer. Restarting the session to switch cards is also not an acceptable option.

I also don't mind the performance I get out of Bumblebee. I believe it's not the best possible, but it's quite good for me and it's sufficient.

Zeioth commented 6 years ago

I don't see how this is dead.

Nvidia still doesn't support PRIME GPU offloading with the proprietary drivers; it only works with nouveau. And as you already know, performance is much worse using nouveau, which makes bumblebee the best option performance-wise right now.

Luckily, Nvidia is working on adding PRIME support to the proprietary drivers.

EDIT: I just discovered bumblebee doesn't support Vulkan yet. So until this feature is implemented, nouveau+PRIME is probably the best option for Nvidia users.

Sangeppato commented 6 years ago

I think that currently the issues with Bumblebee concern OpenGL (performance hit) and Vulkan (no support at all, even if this looks interesting), so mainly video games (and maybe 3D modelling?). Anyway, there's still an important use case that Bumblebee can serve perfectly: parallel computing. For image processing, video editing and any other computational task that can take advantage of the GPU with OpenCL/CUDA, bumblebee works just fine, without any performance penalty. Actually, we don't even need Bumblebee in this case, bbswitch is sufficient, but it's easier and faster to run optirun --no-xorg.
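
For anyone curious, that looks roughly like this (the application name is just a placeholder):

```
# Power up the card and run a CUDA/OpenCL program without starting a secondary X server
optirun --no-xorg ./my_cuda_app

# Quick sanity check that the proprietary driver sees the GPU
optirun --no-xorg nvidia-smi
```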

So I don't think this project is dead and/or useless, considering bbswitch as a part of it: it still serves me every day and there's not any possible substitute.

adhefe commented 5 years ago

Bumblebee seems to have broken. I'm using nvidia driver 390.87 and kernel 4.19.10, and now the command

$ optirun glxspheres64

seems to take forever and nothing returns, with no error message even if -v -debug is used. This command makes the nvidia modules load as expected. Also the bumblebeed.service daemon is working fine.

The primusrun command also doesn't work but returns something:

$ primusrun -c glxspheres64
Polygons in scene: 62464 (61 spheres * 1024 polys/spheres)
ERROR in line 619: Could not open display

Someone else has published this (I couldn't get bumblebee to work following it...): https://github.com/Bumblebee-Project/Bumblebee/issues/951

I'm using Openmandriva linux 3.03.

enricotagliavini commented 4 years ago

Just for everybody's information: in case you missed it, Nvidia now officially supports PRIME offloading on Linux, but only with newer drivers (435.something and newer). I tested it myself during the holidays and I'm impressed, it worked quite well. The only unsupported config I found was when the HDMI port was wired to the Nvidia card. The port cannot be used this way, but I've read this is an Xorg limitation... no idea about Wayland. Still, offloading was working perfectly fine with the built-in screen, but to use the HDMI port I have to switch to Nvidia being the primary GPU the whole time (so not just used when offloading). Not a big problem for me, as I almost never use the HDMI port to game anyway, and there's a chance the Thunderbolt port is actually wired to the iGPU; I will try that when I have a chance.

So we can almost say bumblebee can be sunset now; it is only needed for cards not supported by the 435 driver series and newer.

If you're using Fedora, just install the Nvidia drivers from RPMFusion and you will be good to go. More info at https://rpmfusion.org/Howto/Optimus
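
For the record, with the 435+ drivers, offloading a single application is done with a couple of environment variables (documented in NVIDIA's PRIME render offload chapter); roughly:

```
# Run one GL application on the NVIDIA GPU while the iGPU keeps driving the desktop
__NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia glxinfo | grep "OpenGL vendor"

# Vulkan applications only need the first variable
__NV_PRIME_RENDER_OFFLOAD=1 vkcube
```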

bluca commented 4 years ago

Unfortunately it's not quite that simple: the new solution only supports power saving with Turing and newer cards, as it relies on the nvidia kernel modules for changing power state. Many, many cards supported by the 435 driver will get no power saving whatsoever with Nvidia's solution as of now, even though offloading works perfectly fine - the card will always stay active and powered on.

Once wayland/egl is supported (iirc it wasn't in the first iteration), in theory it should be possible to use xwayland "on demand" - an experimental feature of Gnome 3.34 - together with bumblebee in no-bridge mode to achieve power saving - only when using X applications of course.

enricotagliavini commented 4 years ago

According to the RPMFusion wiki, EGL is supported with driver 440.26 and newer. I haven't tried Wayland yet, only Xorg.

As for power saving, I'm not sure, to be honest. As suggested by the RPMFusion wiki, I set the NVreg_DynamicPowerManagement=0x02 flag for the nvidia kernel module, which should enable power saving even when Xorg is constantly using the GPU. I don't see any difference in power consumption as reported by powertop when comparing with bumblebee and bbswitch. At least for me there is no regression.
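
(For anyone wanting to try the same: the flag goes in a modprobe.d snippet, something like the following; the file name is arbitrary:)

```
# /etc/modprobe.d/nvidia-pm.conf, then reboot
options nvidia NVreg_DynamicPowerManagement=0x02

# Compare idle power draw afterwards, e.g. with powertop as mentioned above
sudo powertop
```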

nvidia-smi reports 4 watts being used though, so you might be right that it doesn't work, but then it was not working with bumblebee either.

bluca commented 4 years ago

From the dynamic power management upstream page:

http://download.nvidia.com/XFree86/Linux-x86_64/440.31/README/dynamicpowermanagement.html

This feature requires a Turing or newer GPU.

What you are experiencing is power scaling down to the lowest state, which is different from completely turning the card off, which bbswitch achieves for any model.