MrMEEE / bumblebee-Old-and-abbandoned

OUTDATED!!!!! - Replaced by "The Bumblebee Project" and "Ironhide"
http://www.martin-juhl.dk/2011/08/ironhide-reporting-for-duty/

high cpu usage on glxgears #326

Closed. outburstx closed this issue 13 years ago.

outburstx commented 13 years ago

I noticed another issue about this, but it was closed.

When I run "optirun64 glxgears", I get about 100% CPU usage. If I pass anything other than "xv" to the "-c" option of vglrun, I get even higher usage (approaching 200%) and, of course, higher fps. Running glxgears without optirun64 uses about 8% CPU.
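
For concreteness, this is the kind of measurement I mean (pidstat is from the sysstat package; any process monitor shows the same thing):

    # without bumblebee (~8% CPU here)
    glxgears &
    pidstat -p $(pgrep -n glxgears) 1 5   # five one-second CPU samples
    killall glxgears

    # with bumblebee (~100% CPU here)
    optirun64 glxgears &
    pidstat -p $(pgrep -n glxgears) 1 5
    killall glxgears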

I know Martin commented that he doesn't have this problem, so perhaps you could suggest a way to debug it?

I'm using the newest NVIDIA drivers (275.09.07) on 64-bit Fedora 14 on an Alienware M11x-r2 laptop.

Could it be an issue with the yum-installed version of VirtualGL? Maybe I should build/install it from source? Could it be an issue with the NVIDIA drivers? Should I try older ones?

Does anyone else have this problem?

xeno-by commented 13 years ago

Same here: 8-10% CPU usage without optirun, ~90% CPU usage with optirun.

Asus U31JG, Ubuntu 11.04 x64, NVIDIA 270.41.06. I never had VirtualGL previously (i.e. it got installed with bumblebee) and didn't try older NVIDIA drivers.

outburstx commented 13 years ago

Update: I tried building/installing VirtualGL from source and got the same result (~90% CPU usage with optirun64 glxgears).

MrMEEE commented 13 years ago

It's natural that jpeg/proxy puts a higher load on the CPU(s), as it encodes every frame as JPEG instead of sending/processing frames in raw format. This is great when VirtualGL is used in a real server/client setup, but it isn't needed in bumblebee, which is partly why I recommend xv.

As far as I can see, xv gives the same or better performance and also fixes a slight lag in action-packed games like first-person shooters (Urban Terror, Counter-Strike).
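
For anyone who wants to compare the transports on their own box, a rough sketch (it assumes bumblebee's nvidia X server runs on display :8, as on my setup, and that pidstat from sysstat is installed; the -c values are VirtualGL's standard compression modes):

    #!/bin/sh
    for mode in proxy jpeg xv; do
        echo "== compression: $mode =="
        vglrun -d :8 -c "$mode" glxgears &
        sleep 5                          # let the fps counter settle
        pid=$(pgrep -n glxgears)         # newest glxgears process
        pidstat -p "$pid" 1 5            # five one-second CPU samples
        kill "$pid"
        wait
    done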

outburstx commented 13 years ago

MrMEEE: understood, but is the ~90% CPU usage using xv normal? You commented earlier that you do not experience this.

MrMEEE commented 13 years ago

@outburstx

I would not expect you to see 90% usage with xv... I got something like 20-30% with xv.

Maybe the problem here is the configuration of the primary X server (the Intel one)...

if I "misconfigure" the primary x-server, by letting it load the nvidia files, I loose 3d effects in KDE and optirun64 glxgears goes to about 80-85% cpu usage...

xeno-by commented 13 years ago

@MrMEEE

How do I properly configure the X server? My /etc/X11 folder does not contain any xorg.conf at all. All it has is the xorg.conf.nvidia file that comes with bumblebee. I'm running an Asus U31JG with an Intel Pentium P6100 (http://ark.intel.com/Product.aspx?id=50175).

MrMEEE commented 13 years ago

If you have used my version of bumblebee (under Ubuntu), then it should already be configured... The problem is not the configuration in xorg.conf, but the configuration of the libraries being loaded (/etc/alternatives)...
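
On Ubuntu that selection goes through the alternatives system, so you can inspect it with something like this (gl_conf is the link group the nvidia packages register):

    # show which GL configuration is currently selected
    update-alternatives --display gl_conf

    # and where the symlink actually points
    ls -l /etc/alternatives/gl_conf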

xeno-by commented 13 years ago

Well, I did. I cloned this repo a few days ago and ran install.sh. Anything else I need to do? Update: how do I verify that everything has been configured properly?

MrMEEE commented 13 years ago

Nope... then you should be fine...

xeno-by commented 13 years ago

How do I verify that everything has been configured properly?

Here's my install log (I uninstalled bumblebee, did a git pull, and ran install.sh again): http://pastebin.com/K6Lg1pNJ

What else should I do? Maybe some symlinks got messed up and we can fix them?

MrMEEE commented 13 years ago

That should be fine... does glxgears work without optirun?
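
A quick check is the renderer string with and without optirun (assuming glxinfo is installed; the first line should report the Intel GPU, the second the nvidia one):

    glxinfo | grep -i "opengl renderer"
    optirun64 glxinfo | grep -i "opengl renderer"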

xeno-by commented 13 years ago

Yep, here's the log of running glxgears with and without optirun: http://pastebin.com/gdZcJMKb

xeno-by commented 13 years ago

Also, I noticed a suspicious "link group gl_conf is broken" message during a previous uninstallation of bumblebee, but forgot to copy/paste the log. That's why I uninstalled bumblebee once again to get the log and the exact message: http://pastebin.com/PhzMMBsJ. Check line 26.
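
In case it's relevant: would something like this be the right way to inspect and reset that link group? (gl_conf is the group named in the log; these are just standard update-alternatives commands.)

    # inspect the state of the gl_conf group
    update-alternatives --display gl_conf

    # let the alternatives system pick the highest-priority option again
    sudo update-alternatives --auto gl_conf
    sudo ldconfig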

xeno-by commented 13 years ago

Also, what did you mean by "if I 'misconfigure' the primary X server by letting it load the nvidia files"? How did you do that? Could you please explain how this part of bumblebee works?