Bumblebee-Project / Bumblebee

Bumblebee daemon and client rewritten in C
http://www.bumblebee-project.org/
GNU General Public License v3.0

Bumblebee for Desktop #577

Open Bugattikid2012 opened 10 years ago

Bugattikid2012 commented 10 years ago

$ optirun
[ 641.107275] [ERROR]The Bumblebee daemon has not been started yet or the socket path /var/run/bumblebee.socket was incorrect.
[ 641.107374] [ERROR]Could not connect to bumblebee daemon - is it running?

That is the error I receive. I've tried reinstalling nvidia and bumblebee countless times, and I've tried different ways of installing (though I haven't tried a manual install, but it probably wouldn't make a difference). To sum it up, I've done everything in this thread and more.

http://askubuntu.com/questions/454729/installed-14-04-and-nvidia-graphics-card-wont-load

I've looked and looked on the internet, and it seems that nothing will work for me. The only solution I can think of is to get a version of this project older than 3.1. I've tried the terminal downgrade method listed here

http://www.howtogeek.com/117929/how-to-downgrade-packages-on-ubuntu/

but it seems only the latest version is available. I NEED this to work, as I can't even get my main monitor to work without it. Please help! Thanks.
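(In case it's useful, my downgrade attempt was along these lines; the version string is just a placeholder, since only the latest shows up for me:)

$ apt-cache policy bumblebee                       # lists the versions apt can see
$ sudo apt-get install bumblebee=<older-version>   # <older-version> is a placeholder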

Specs:

MSI 780 Ti Gaming

i7 4770k

16 gigs ram

SSD for boot and games that need it; HDD for storage

Dual booting Ubuntu 14.04 with Windows 7 (or Winderp$ as I like to call it)

amonakov commented 10 years ago

Why are you trying to set up Bumblebee on a desktop? Any reason you can't just plug your monitors into the nVidia GPU directly?

Bugattikid2012 commented 10 years ago

My main monitor is plugged into it. I want to use the dedicated card for games and such, and the integrated one for lighter processes so it won't be heating up the main GPU. In other words, I'll have faster cool-down times this way.

Bugattikid2012 commented 10 years ago

Besides, if I did that I wouldn't have any video output.

amonakov commented 10 years ago

I'm still unclear on what you're trying to achieve. Note that Bumblebee might not be the right tool for the job.

As for the optirun failure, the error message tells you that the Bumblebee daemon is not active. You probably need to read the system logs (in /var/log, usually /var/log/messages) to see why it failed to start.
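For example, something along these lines should show whether the daemon is running and why it failed (exact service and log names depend on your distribution):

$ sudo service bumblebeed status       # or: systemctl status bumblebeed on systemd systems
$ grep -i bumblebee /var/log/syslog    # Ubuntu logs to syslog rather than /var/log/messages
$ dmesg | grep -i bbswitch             # kernel-side messages from bbswitch, if it was loaded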

Bugattikid2012 commented 10 years ago

I'm pretty sure they're empty as far as bumblebee logs go.

What I'm attempting to do is go into my NVIDIA settings and be able to launch applications using my dedicated card. As of now, when I run nvidia-settings I do not get the full page. This is a common problem, and you have to run optirun nvidia-settings -c :8 to get it to show all of the settings. Because optirun and primus will not work, I am unable to use my main monitor or do any gaming on my Linux partition. I have to find a fix for this, or I can't use Linux at all.

ArchangeGabriel commented 9 years ago

Bumblebee can’t work on a desktop, because bbswitch won’t load. We might consider making bbswitch an optional dependency, but I’m not sure it’s worth it.

Bugattikid2012 commented 9 years ago

How much effort would it actually be? I don't understand what exactly is holding you back; there's no such thing as a different OS for a laptop versus a desktop. They function the same way, unless Linux is just weird like that. Laptops aren't lacking anything desktops have; they're just smaller and designed to use less power.

ArchangeGabriel commented 9 years ago

The difference is on the BIOS/ACPI side. Laptops have specific functions there for Optimus that desktops don’t have, and we use them in bbswitch.

What happens in your case must be that Bumblebee starts and loads bbswitch; bbswitch tries to find those functions, fails to do so and thus fails to load, which leads Bumblebee to abort loading. That explains why you can’t connect to the daemon.

I’m not sure that’s a lot of effort: we mainly need to change how Bumblebee depends on bbswitch to make it optional, take care to report an appropriate error if bbswitch fails to load on a laptop, and sort out packaging so that bbswitch is still a default for laptops while not being installed on desktops (I suppose that for this part, making a bumblebee-desktop package is the right way).

What is holding me back is more that, like @amonakov, I’m not sure Bumblebee is the right tool for this; perhaps something already exists that handles this case of hybrid GPUs on a desktop. I need to find documentation about this.

Bugattikid2012 commented 9 years ago

Thank you for your interest. I also wasn't expecting BIOS/UEFI to be the limiting factor here, but that does make more sense.

I guess I should clarify what I am attempting to do. My plan was/is to have my main monitor running from my dedicated card and my second monitor running from my integrated card. There is a noticeable difference when my dedicated card is running just one monitor instead of two: it stops a bit of stuttering and sometimes gives upwards of a 15 FPS improvement. Also, if my dedicated card is getting noticeably hot, I can run off my integrated card and second monitor for a few minutes while I wait for the dedicated card to cool off. It won't be doing much of anything if the computing is happening on the integrated card, which results in a noticeably faster cool-down time.

It sounds like I'm going into extreme detail, and admittedly I sort of am. However, I'm not talking about a 1 FPS difference; it's a decent 15+ FPS boost, so I think it's worth it. The cool-down time isn't that much, but it's still something.

I understand why you were confused by my request. I always knew Bumblebee was mainly used for laptops, but I didn't realize it (usually) didn't work at all on desktops. I haven't been able to find any alternatives to Bumblebee, but if I do find one I will try to post it here. Thank you for your interest and help. If you have anything else that could help with my issue, I would greatly appreciate you letting me know.

ArchangeGabriel commented 9 years ago

OK, this is in fact far from what we do here. You’re going to have your NVIDIA card powered all the time, with one screen on it and one screen on the Intel card.

After a little looking around: even on Windows, Intel relies on paid software called Lucid Virtu that used to be bundled with some motherboards. It lets you decide which card manages the screen(s) (even if it’s not the card they are plugged into, though in that case at a high performance cost) and which card renders each application. But you can’t run one screen on one card and one screen on the other. And the reviews out there say Lucid Virtu is crap.

What is your setup on Windows?

On Linux, does it work if you plug one screen into each card? I mean, are both screens detected and used in this case? I would be very surprised by that.

Here, in Bumblebee multi-monitor setups (which concern users whose HDMI output is wired to the NVIDIA chip), you have to start a second X server on the external monitor, and there you can start another DE/window manager. But it’s not very stable, and most importantly, since it isn’t the same server/DE/WM, you can’t move windows from one screen to the other.

Through PRIME (and its reverse setup), you can choose which card manages the screens and which one renders each application (ideally; in real life the drivers, especially NVIDIA’s, don’t work in every mode), but here again you can’t have one screen on each card.
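(For reference, the PRIME offload side with open-source drivers looks roughly like this; the provider names below are examples and should be taken from the first command’s output:)

$ xrandr --listproviders                           # both GPUs should appear as providers
$ xrandr --setprovideroffloadsink nouveau Intel    # example names; use the ones actually listed
$ DRI_PRIME=1 glxinfo | grep "OpenGL renderer"     # this client renders on the offload GPU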

So, AFAIK, there is no existing solution where the two screens are each managed by a different card while still being part of the same desktop. I’m not sure this is technically possible, but even if it is, it would be a huge effort to implement for very little interest.

I think you should either let your integrated card manage both screens while doing some rendering on the NVIDIA one (this could maybe be achieved using PRIME, or at least a desktop version of Bumblebee), start an X server on each screen with a different card, each with its own DE/WM (but that must be a lot of pain to set up), or just let your NVIDIA card do everything at the cost of 15 FPS (as long as you’re getting 60 or even 30, you’re fine I think).

Bugattikid2012 commented 9 years ago

Thanks for the detailed reply. I'm currently running a 144 Hz monitor, so keeping the settings as high as possible while still staying around 144 FPS is key. (Please don't start with the BS about how the human eye can't see more than 30 FPS; it's SOOOOO incorrect. If you don't believe me, kindly ask for sources and I'll be more than happy to fetch some for you. Also, the eye can benefit from as many FPS as you can throw at it, though there won't be any /significant/ benefit past roughly 75-150 FPS, depending on the individual. I can vouch that 144 FPS/Hz makes a world of difference, even when just moving your mouse around on the desktop.)

On Winderp$, I can run an AMD card on one monitor and an NVIDIA card on the other. Heck, I could even run one from AMD, one from NVIDIA, and a third from Intel (in theory at least; I don't know if it's actually been done, but I don't see anything preventing it on Winderp$, and I can confirm that any combination of two of the three cards works fine).

My setup on Winderp$ is exactly how I want it to be on Linux. I'm running my main monitor over dual-link DVI-D from my NVIDIA card, and my second one is hooked up through HDMI to the Intel port on my motherboard. It's never given me issues, at least not on this PC. I've had a few issues on my laptop, which runs the same way (an Intel HD 4000 series graphics chip and a 650M from NVIDIA), but they were resolved pretty easily: a Winderp$ update had automatically installed a driver for my Intel chip and something went wrong. After a little troubleshooting, I found that all I had to do was reinstall the Intel graphics driver. So no /real/ issues on Winderp$.

On Linux, if I have my integrated graphics enabled in my BIOS/UEFI, it's the only thing Linux will detect, and it refuses to admit my NVIDIA card exists. If I disable it, I get some potato resolution around 640x480 that I can't change very easily, though after some tinkering I can get it to work. Then, once I enable my integrated card again, Linux prefers it over my dedicated card, despite my BIOS/UEFI being set to use the dedicated card whenever possible.
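(For what it's worth, this is how I check which cards Linux actually sees:)

$ lspci -nn | grep -iE 'vga|3d'    # lists the VGA/3D controllers the kernel detects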

I guess I'll just have to leave it as it is for the moment; I haven't even used Linux in months because of this issue.

I don't remember their names, but supposedly there are two new things in the works to replace X.org. I think one of them starts with an M, and Canonical is sort of spearheading that one, if I recall correctly. These obviously won't be perfect for a few years to come, but is it /PROBABLE/ that they could fix my issue? Will they give more performance? Fewer bugs? Etc.? And how soon until they are estimated to be perfected/patched?

I guess I'll just wait until these two come out and see what they have to offer. Thanks again for your very dedicated help.

ArchangeGabriel commented 9 years ago

OK, I didn’t know M$ had this working; I would be interested to know how they do that.

Currently, AFAIK, this is not possible under Linux, but maybe one day…

The two things you mention are Mir from Canonical, which was originally expected to land in 13.10 and now isn’t expected before 16.04 (so I suppose it will be 18.04 or something like that), and Wayland. Back when Wayland development started, people were expecting it to be in use around 2015, but I don’t know the state of the project today and know no one using it, plus I have no idea how they handle things like multi-GPU and/or multi-monitor.

And in both cases, I think it’s too early to talk about performance and bugs, because I expect neither to be as reliable as X.org today, but the goal of reimplementing a display server is more performance, fewer bugs, and easier maintenance/evolution.

And in 2010, PRIME was expected to work fully by 2012, and today it’s still only almost working. So be prepared to wait a little longer, I think, before being able to use such a configuration under Linux.

Bugattikid2012 commented 9 years ago

Wow, it got pushed that far back? I was expecting it to remain /fairly/ close to the original release dates. But 18.04? Dang. Good thing I don't use Ubuntu anymore.

I guess we'll just have to wait and see if Wayland comes out with a fix for this, and whether it comes out in the next decade... Stupid Xorg... Relevant XKCD: http://xkcd.com/963/ (don't forget the alt text!)

Also this one: http://xkcd.com/456/

Thanks for your help. I'm sure one day I'll find/make a fix if one doesn't exist already by then, but I don't have the knowledge to do either at the moment. If I ever find one I'll let you know.

ArchangeGabriel commented 9 years ago

Well, they say it will be 16.04. But the past has taught us that every time they give a deadline like this, it slips to the next cycle.

wilfm commented 8 years ago

Nearly the same for Wayland as well....

Anyway, are there any desktop boards that implement BIOS/ACPI similarly to laptops, so that Bumblebee (or any other hybrid-graphics optimisation) would work? (Sorry, I'm a bit of a noob in this area. :)

Bugattikid2012 commented 8 years ago

I'm not sure about boards, but I recently found out that Arch Linux claims Bumblebee can work on desktops.

wilfm commented 8 years ago

OK :). I'm just interested because I need to upgrade my desktop (but don't want it to use loads of power all the time), and I have found that some desktop boards seem to have some weird graphics implementations available (e.g. LucidLogix Virtu: http://www.gigabyte.com/MicroSite/279/images/mb-z68-built-in-visuals.html), but possibly under Windowz only.

With NVIDIA's 'Optimus', they apparently were going to launch it for desktops (http://news.softpedia.com/news/NVIDIA-Optimus-Lands-on-Desktops-196761.shtml) but never did.

dvaerum commented 8 years ago

I can confirm that it worked under Arch Linux, but it would stop working after I put the computer to sleep and resumed; I then had to reboot the computer and it would work again. However, it stopped working for me entirely at some point; I'm not sure when, and that is why I am reading this issue. I don't know if it works for other Arch users, but it was nice when it worked for me, because I was saving about 80-100 W.

ArchangeGabriel commented 8 years ago

@dvarum12 Is there any chance you still have logs from the time when it worked? If you have high disk-usage and time limits for journalctl, you might (I still have full logs from my initial install), in which case we could see what changed and whether there is something we could do about it. Also, how did you measure that power saving?
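(Something like this should show whether the old boots are still retained; the unit name assumes the Arch bumblebeed service:)

$ journalctl --list-boots
$ journalctl -b -5 -u bumblebeed    # -5 = five boots ago; adjust to the boot you want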

dvaerum commented 8 years ago

@ArchangeGabriel I don't have the old logs, but I do have Bumblebee working again, with Steam on Arch Linux :smile: If you can use that for anything?

ArchangeGabriel commented 8 years ago

So, you currently have your screen handled by the integrated Intel card, use Bumblebee to run things on the Nvidia card, and get power savings in this situation vs. using the Nvidia card for the display? No bbswitch involved, right?

dvaerum commented 8 years ago

@ArchangeGabriel Yes, and considering that I am not a big gamer but do play from time to time, this is a nice thing to have. Of course, it is not perfect and there are problems, but for me the pros outweigh the cons.

ArchangeGabriel commented 8 years ago

Could you give a dmesg log, and Xorg.{0,8}.log too? That sounds interesting.

dvaerum commented 8 years ago

Yes, when I get home I will get that for you.

dvaerum commented 8 years ago

As promised, hope you find something :smiley: dmesg.txt Xorg.0.log.txt Xorg.8.log.txt

ArchangeGabriel commented 8 years ago

OK, thanks. So, Bumblebee does indeed work on desktops as a free bonus since we added support for different PM backends, none being one of the options (thus no failure from bbswitch being unable to load, which was the point earlier in this discussion); however, I see no evidence of real power saving here. I think we would need a desktop equivalent of Optimus at the ACPI level for that, unless we can perhaps manage to turn off the PCIe port (not sure whether this is doable on a desktop, especially without powering the card off first, but there is at least the idea: https://github.com/Bumblebee-Project/bbswitch/pull/130).
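(For reference, the PM backend is selected in /etc/bumblebee/bumblebee.conf; a sketch of the relevant lines, assuming the nvidia driver section:)

[driver-nvidia]
# "none" skips bbswitch/ACPI power management entirely
PMMethod=none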

Or maybe there’s some sort of magic somewhere that already does this automatically, hence the power savings you report (that, or you didn’t compare the right things).

I’m not sure how easy this is for you to try (you will have to change some config files a bit at least), but it would be interesting to see the power consumption when idling with the screen wired to Intel vs. Nvidia, and also the performance in both cases.

dvaerum commented 8 years ago

So you don't believe in magic :( My guess is that the graphics card goes into some kind of sleep mode when it is not used, but I don't know. I don't have too much time to play around with it right now, but when school is over I will be happy to do some testing on it for you, if you are interested?

ArchangeGabriel commented 8 years ago

Neh, I’m a scientist. :p All things have an explanation, including the explanations themselves, except maybe at the end of the chain, but that’s too philosophical. ;)

Yes, sure that would be interesting. Looking forward to hearing again from you soon. :)

dvaerum commented 8 years ago

Hey @ArchangeGabriel school is over and I am ready to mess with some config files :smile:

sabian2008 commented 7 years ago

Just wanted to report that I might be able to do some tests too. I've just installed Bumblebee on Debian Testing and it worked out of the box (with lightdm; gdm3 won't start, but I don't have time to look into that now). My goal was to test power consumption (and to play with something new on a boring day). How can I monitor power consumption?

I have an NVIDIA 750 Ti and a "VGA compatible controller: Intel Corporation Xeon E3-1200 v3/4th Gen Core Processor Integrated Graphics Controller (rev 06)" Intel card.

I don't know what the potential drawbacks of powering off the card could be (using acpi_call?). Any insights?

Lekensteyn commented 7 years ago

@sabian2008 The power-saving methods are not available on desktops, only on "Optimus" laptops with hybrid graphics technology. bbswitch, acpi_call, etc. won't work because the ACPI methods are missing.

If you want to offload graphics rendering, you could have a look at "DRI PRIME", but do not expect huge benefits here. Just plug your monitor into the stronger GPU.

As for power testing, I bought an Energy Logger 4000 (and use this tool for reading the history) for this purpose, but there are others available on the market.

sabian2008 commented 7 years ago

@Lekensteyn Yes, after I posted I noticed that the blog post I had superficially read (which claimed to use acpi_call on a desktop) had an error in the introduction, and the rest of the article talked about laptops.

Are there any potential drawbacks to using Bumblebee without bbswitch on a desktop? I only play demanding games about 6 hours per month, and sometimes I use ParaView. In neither case have I noticed appreciable slowdowns running with primusrun. So I guess it works for me and keeps my NVIDIA card at very low power levels and fan speed (1 W and 0% respectively, according to nvidia-smi; I don't have hardware to measure that stuff).
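(Roughly what I'm reading those numbers from; the exact fields available depend on the driver version:)

$ nvidia-smi --query-gpu=power.draw,fan.speed --format=csv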

My only concern (as I'm ignorant of the low-level stuff) is whether it could damage the hardware (intuitively I'd say that's impossible).

Sorry if I posted off-topic here, didn't mean to.

Cheers,

Lekensteyn commented 7 years ago

@sabian2008 Bumblebee can indeed be used without bbswitch, and it won't damage your hardware. No problem asking here; it seems on-topic.

rokups commented 7 years ago

I am interested in this as well. Is it in any way possible to run this with the nvidia libgl installed? I am aware that Bumblebee requires the integrated GPU to use the Mesa libraries; however, my main GPU is an old Nvidia card which freezes with the nouveau driver. Since I am not familiar with Bumblebee's design, could any of the developers comment on whether it is in theory possible to get Bumblebee working with nvidia libgl as the main libgl? Any pointers on what part of the code I could poke to make it happen?

Lekensteyn commented 7 years ago

@rokups Do you have both an integrated and a discrete Nvidia GPU, or what is the situation? What is your goal?

rokups commented 7 years ago

I have two discrete GPUs. One is sometimes used for passthrough to a VM, and one is used for the host. The goal is to use Bumblebee to start some 3D applications on the VM GPU (when it is not in use by the VM, of course), as it is more powerful.

ArchangeGabriel commented 7 years ago

@rokups Not sure how much stronger your second GPU is, but Bumblebee has a very high overhead (and even more without Primus, VirtualGL being the only setup likely to work for you), so this might be a thing…

rokups commented 7 years ago

Well, it is gtx630 vs gtx1080, so... I would still like to try. Why wouldn't primus work?

Also, I got the package installed and tried to start optirun glxgears. A strange thing happens. The log contains:

[   544.086] (++) Using config file: "/etc/bumblebee/xorg.conf.nvidia"
[   544.086] (**) |   |-->Device "DiscreteNvidia"

And I added BusID "PCI:5:0:0" to xorg.conf.nvidia (relevant section sketched at the end of this comment), so it should be using that card. But later in the log:

[   544.087] (II) xfree86: Adding drm device (/dev/dri/card0)
[   544.089] (--) PCI:*(0:3:0:0) 10de:0f00:1458:3544 rev 161, Mem @ 0xde000000/16777216, 0xc8000000/134217728, 0xd0000000/33554432, I/O @ 0x0000e000/128, BIOS @ 0x????????/131072

Now, that is my primary GPU. Naturally, it fails:

[   544.093] (EE) NVIDIA(GPU-0): Failed to initialize the NVIDIA graphics device!

Any idea what's going on here? Full log attached as well: Xorg.8.log.txt
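For reference, the relevant part of my /etc/bumblebee/xorg.conf.nvidia looks roughly like this (sketch; everything except the BusID line is the stock file as far as I remember):

Section "Device"
    Identifier  "DiscreteNvidia"
    Driver      "nvidia"
    BusID       "PCI:5:0:0"
    Option      "ProbeAllGpus" "false"
EndSection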

Lekensteyn commented 7 years ago

@rokups I don't think that Bumblebee is appropriate for your scenario. You presumably still have video outputs on your more powerful GPU. One option is then to plug your monitor into that GPU; it will be a more pleasant experience than Bumblebee, I think. Bumblebee currently relies on unloading modules to unbind the driver from the GPU device; if both your primary and secondary GPUs use the same driver, you have a problem.

PRIME GPU offloading is another option, but this is not supported by the Nvidia blob.

rokups commented 7 years ago

Still, it does not sound like an impossible task. Currently I am using a shell script that binds the driver to the card, starts X, and after X quits binds the vfio-pci driver back to the card (roughly as sketched below). Nothing that can't be done with some Bumblebee patching, I guess.
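Roughly, the rebinding part of that script does something like this (sketch; the PCI address is just an example):

DEV=0000:05:00.0
echo "$DEV" > /sys/bus/pci/devices/$DEV/driver/unbind   # detach whatever driver holds the card (e.g. vfio-pci)
echo "$DEV" > /sys/bus/pci/drivers/nvidia/bind          # hand the card to the nvidia driver
# ... start X / run the application here ...
echo "$DEV" > /sys/bus/pci/devices/$DEV/driver/unbind   # release it again
echo "$DEV" > /sys/bus/pci/drivers/vfio-pci/bind        # give it back to vfio-pci for the VM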

EDIT: Actually it was not hard at all ;)

~/pkg/bumblebee % primusrun glxinfo|grep 1080
OpenGL renderer string: GeForce GTX 1080/PCIe/SSE2

All I needed to patch in Bumblebee was swapping pci_id_igd and pci_bus_id_discrete in main() in bumblebeed.c and making sure the nvidia driver is bound to that card. And it works. I should probably work out a proper patch for explicitly configuring which GPU is primary and which isn't. The only quirk I noticed is a crash when the application is terminated if it runs through primusrun:

~/pkg/bumblebee % primusrun glxspheres64
Polygons in scene: 62464 (61 spheres * 1024 polys/spheres)
Visual ID of window: 0x27
Context is Direct
OpenGL Renderer: GeForce GTX 1080/PCIe/SSE2
61.040996 frames/sec - 68.121751 Mpixels/sec
59.957958 frames/sec - 66.913081 Mpixels/sec
60.046534 frames/sec - 67.011932 Mpixels/sec
X Error of failed request:  GLXBadDrawable
  Major opcode of failed request:  154 (GLX)
  Minor opcode of failed request:  11 (X_GLXSwapBuffers)
  Serial number of failed request:  54
  Current serial number in output stream:  55
primus: warning: timeout waiting for display worker
terminate called without an active exception
[1]    10880 abort (core dumped)  primusrun glxspheres64

Not a big deal though.

EDIT2: Why on earth does bumblebee depend on mesa-libgl? It is not strictly required; bumblebee does not care at all. To me it seems like an artificial constraint that helps get a proper setup on Optimus laptops, but it is not a requirement at all. I have nvidia-libgl installed and Bumblebee is buzzing along like a charm without any modifications to that LD_LIBRARY_PATH magic, because it really does not change anything.

deveee commented 6 years ago

@ArchangeGabriel It's not just about power saving. It would also make the computer quieter, and it would make it possible to easily test applications on different hardware (an easy switch between Intel and NVIDIA). All of these are IMO useful.