libicocco opened this issue 13 years ago
What distro are you running? What packages have been upgraded (from ... to)? You can use /var/log/apt/history.log
for this information.
I'm running Ubuntu 11.04. I ran:
sudo bumblebee uninstall
and then these are the relevant changes (extracted from apt history):
Start-Date: 2011-07-30 11:01:57
Commandline: apt-get -y --purge remove nvidia-current
Purge: nvidia-current:amd64 (270.41.06-0ubuntu1)
End-Date: 2011-07-30 11:02:27
Start-Date: 2011-07-30 11:03:02
Commandline: apt-get install ppa-purge
Install: aptitude:amd64 (0.6.3-3.2ubuntu1, automatic), libboost-iostreams1.42.0:amd64 (1.42.0-4ubuntu2, automatic), libcwidget3:amd64 (0.5.16-3ubuntu2, automatic), ppa-purge:amd64 (0.2.8+bzr56)
End-Date: 2011-07-30 11:03:06
Start-Date: 2011-07-30 11:04:19
Commandline: apt-get install bumblebee
Install: virtualgl:amd64 (2.2.80-1~nattyubuntu4, automatic), nvidia-current:amd64 (280.04-0~nattyubuntu3, automatic), bumblebee:amd64 (2.2.0-1~nattyubuntu8), acpi-call-dkms:amd64 (240611-1~natty, automatic)
End-Date: 2011-07-30 11:07:08
Start-Date: 2011-07-30 12:03:04
Commandline: apt-get upgrade
Upgrade: nvidia-settings:amd64 (270.29-0ubuntu1, 275.19-0~nattyubuntu0)
End-Date: 2011-07-30 12:03:11
The compilation problem is solved by downgrading the nvidia-current package from the one provided by the bumblebee PPA to the official one:
Start-Date: 2011-07-31 11:05:29
Commandline: synaptic
Downgrade: nvidia-current:amd64 (280.04-0~nattyubuntu4, 270.41.06-0ubuntu1)
End-Date: 2011-07-31 11:06:35
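For reference, the same downgrade can also be done from the command line with apt-get's version-pinning syntax (the version string here is taken from the apt log above):
sudo apt-get install nvidia-current=270.41.06-0ubuntu1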
So I guess it's a bug in the nvidia driver provided by the bumblebee PPA. Apart from this, I have to run
sudo /etc/init.d/bumblebee stop
each time I want to run something with optirun. Running that starts the server:
* Starting Bumblebee X server bumblebee
/usr/local/bin/bumblebee-enablecard: 1: �: not found
[ OK ]
and after running the command with optirun I get the following:
* Stopping Bumblebee X server bumblebee
/usr/local/bin/bumblebee-disablecard: 1: �: not found
[ OK ]
/usr/local/bin/bumblebee-disablecard: 1: �: not found
Then I have to stop bumblebee again if I want to "optirun" something again. I tried setting up the disablecard/enablecard scripts, but loading acpi_call seems to prevent the nvidia module from loading properly. Besides, I hadn't set up those scripts before (before the PPA) and I didn't have these problems then.
Thanks for the help!
/etc/init.d/bumblebee stop
is the same as /etc/init.d/bumblebee enable
(not sure if this was intended).
That question mark is the byte 0xAB. I don't know where it came from, although I'm affected too. Does lspci | grep VGA
show your nvidia card?
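One quick way to confirm such a stray byte, if it sits at the very start of the script (a sketch; assumes the xxd tool is installed):
head -c 1 /usr/local/bin/bumblebee-enablecard | xxd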
I guess it was intended, since lines 161 and 171 of the file (/etc/init.d/bumblebee) comment on the "weird logic":
# weird logic: enable the bumblebee server when the system is shut down?
# weird logic: disable the bumblebee server when the system is started?
The hex code was there in the files /usr/local/bin/bumblebee-disablecard and /usr/local/bin/bumblebee-enablecard from the PPA. I read that users are supposed to test which acpi command enables and disables the card, and change those files using the modified templates provided. However, as I mentioned, acpi_call somehow interferes with loading the nvidia module. The nvidia card is shown in lspci:
00:02.0 VGA compatible controller: Intel Corporation 2nd Generation Core Processor Family Integrated Graphics Controller (rev 09)
01:00.0 VGA compatible controller: nVidia Corporation Device 0df4 (rev a1)
@libicocco: that was my comment, as I'm not sure whether it was intended or not. By "acpi_call interferes with the loading of the nvidia module", do you mean that you get an error in your kern.log about an unsupported card?
Hi, I work on OpenGL 4 programs too. Has either of you already run an OpenGL 4 program with bumblebee? I can't figure out how to create a valid context with it...
@Lekensteyn: when I loaded the acpi_call module and rebooted, there was an error in /var/log/Xorg.8.log about the nvidia card (which doesn't happen if acpi_call is not loaded). Unfortunately I cannot reproduce it now because the acpi_call module doesn't exist anymore (since the bumblebee update, maybe?). @hiairrassary: what is your problem exactly? My small application compiles and runs correctly with nvidia driver 270.
@libicocco: would you mind including your Xorg.8.log file (which could now be Xorg.8.log.old) and the relevant error messages (kern.log?)?
Xorg.8.log can be found in http://pastebin.com/H9hcSHDY and a cropped version of kern.log.1 in http://pastebin.com/LFA46qFB
My problem is that the GLX implementation doesn't expose the GLX_ARB_create_context extension, so I cannot create an OpenGL 4 context, as in this tutorial on the OpenGL wiki:
The program creates an OpenGL 4 context. When you compile and run this program, what output do you get with bumblebee?
Thanks for your answer!!
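For what it's worth, the extension's presence can be checked directly with the standard glXQueryExtensionsString call. A minimal self-contained sketch (the file name check.cpp is arbitrary; compile like the other samples in this thread, e.g. g++ check.cpp -o check -lX11 -lGL):

#include <stdio.h>
#include <string.h>
#include <X11/Xlib.h>
#include <GL/glx.h>

int main()
{
    Display *display = XOpenDisplay(0);
    if (!display)
        return 1;
    // Space-separated list of all GLX extensions for this screen.
    const char *exts = glXQueryExtensionsString(display, DefaultScreen(display));
    // Naive substring check; good enough here, since related extensions
    // (e.g. GLX_ARB_create_context_profile) ship alongside the base one.
    printf("GLX_ARB_create_context is %s\n",
           strstr(exts, "GLX_ARB_create_context") ? "present" : "MISSING");
    XCloseDisplay(display);
    return 0;
}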
@hiairrassary This is the output I obtained:
Getting matching framebuffer configs
Found 9 matching FB configs.
Getting XVisualInfos
Matching fbconfig 0, visual ID 0x21: SAMPLE_BUFFERS = 0, SAMPLES = 0
Matching fbconfig 1, visual ID 0x21: SAMPLE_BUFFERS = 0, SAMPLES = 0
Matching fbconfig 2, visual ID 0x21: SAMPLE_BUFFERS = 0, SAMPLES = 0
Matching fbconfig 3, visual ID 0x21: SAMPLE_BUFFERS = 1, SAMPLES = 2
Matching fbconfig 4, visual ID 0x21: SAMPLE_BUFFERS = 1, SAMPLES = 2
Matching fbconfig 5, visual ID 0x21: SAMPLE_BUFFERS = 1, SAMPLES = 2
Matching fbconfig 6, visual ID 0x21: SAMPLE_BUFFERS = 1, SAMPLES = 4
Matching fbconfig 7, visual ID 0x21: SAMPLE_BUFFERS = 1, SAMPLES = 4
Matching fbconfig 8, visual ID 0x21: SAMPLE_BUFFERS = 1, SAMPLES = 4
Chosen visual ID = 0x21
Creating colormap
Creating window
Mapping window
glXCreateContextAttribsARB() not found ... using old-style GLX context
Direct GLX rendering context obtained
Making context current
It's the same for me. And you can see the glXCreateContextAttribsARB() function is missing, so there is no GLX_ARB_create_context extension, and therefore it's impossible to use OpenGL 4! The maximum is OpenGL 2.1!
On a laptop with an AMD HD 5xxx I don't get this output, but "Created GL 3.0 context" instead.
I think bumblebee uses VirtualGL, which provides its own GLX implementation (distinct from OpenGL) rather than NVIDIA's, while plain OpenGL calls do go to NVIDIA's implementation.
P.S. If GLX_ARB_create_context is absent, it's impossible to use OpenGL >= 3.0!!
But I think Bumblebee is a very good project for Nvidia owners :)
Issue #481 has been marked as duplicate
VirtualGL is responsible for 3D support
I asked the developers of VirtualGL and they answered that they are going to see what they can do!
So, wait and see :)
Great, thank you very much!
The new VirtualGL version released today added support for GLX_ARB_create_context!
A big Woot to the devs!
Oh my god! =)
And do you know how to use this version? Via your PPA?
The developers of VirtualGL gave me a beta version. So I'll test it and let you know!
As you can read in https://sourceforge.net/tracker/?func=detail&atid=678330&aid=3386112&group_id=117509, I couldn't get the source to compile, so our PPA won't contain this new build.
I have successfully created an OpenGL 4.1 context with the pre-release! =)
Compiled packages are available here: "http://www.virtualgl.org/DeveloperInfo/PreReleases". There are packages for Debian-like distros (.deb) and for RedHat/SuSE/Fedora (.rpm). I downloaded the .deb and successfully installed it with dpkg -i VirtualGL_2.2.90_amd64.deb.
Then I created a link with ln -s to libGL in /usr/lib/nvidia-current, and that's all.
It's fantastic for developers!
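The exact ln -s invocation isn't given above, so treat this as a purely hypothetical sketch (both paths are assumptions and depend on your VirtualGL and driver layout):
sudo ln -s /usr/lib/nvidia-current/libGL.so.1 /usr/lib/nvidia-current/libGL.so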
We know that those DEBs and RPMs exist, but we can't use them for Bumblebee.
We will stay on 2.2.80 for now, and try to work out how to compile 2.2.90.
Ok ok. But I don't understand the problem. What is rrfakerut?
And why can the VirtualGL developer compile it without any problems?
We're not able to clearly understand the problem ourselves; otherwise we might have solved it already.
And about your last question, we were asking ourselves exactly the same thing, and we have now asked the developers (mostly him, in fact, as it looks like dcommander works alone).
@hiairrassary VirtualGL loads its own GL library into applications. rrfakerut looks like the core of this magic; it implements the GL stuff. The developer of VirtualGL is probably running a very, very recent version of Mesa.
Its own GL library? I thought VirtualGL used the Nvidia implementation.
But if you have asked dcommander, that's a good way to get the new VirtualGL version integrated directly into your PPA!
Tomorrow I'll try a bigger OpenGL 4.1 program :) But if it runs, we must find a solution for all OpenGL developers, because tessellation shaders are a very powerful and cool feature =)
And the OpenGL 4.2 specification was released one week ago! Do you know if you will ship newer Nvidia drivers shortly?
I misphrased it: VirtualGL loads its own library into the application, which takes care of transferring the frames, afaik.
I'm already using nvidia's latest drivers (280.13), but even those won't let me compile VirtualGL.
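For illustration, that mechanism normally amounts to preloading VirtualGL's faker library into the launched application, roughly like this (the exact path is an assumption; librrfaker is the library the rrfakerut test mentioned above belongs to):
LD_PRELOAD=/usr/lib/librrfaker.so glxgears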
Those are not the latest, Lekensteyn.
nVidia has already released a new version adding support for OpenGL 4.2.
But it's mainly targeted at developers.
I'm not sure the OpenGL 4.2 Nvidia drivers are available on Linux... to be confirmed.
Can you post the errors you get when compiling VirtualGL (the same ones you wrote on SourceForge)?
Yes, it is the same as what Lekensteyn wrote on SourceForge. However, it looks like dcommander has already fixed it.
nVidia OpenGL 4.2 driver for Linux: http://developer.nvidia.com/opengl-driver (at the bottom of the page).
And as you can read here: http://developer.nvidia.com/content/nvidia-opengl-42-drivers-windows-and-linux-now-available, they have been available since the 9th.
It's good news! Will you upgrade the Nvidia drivers in the Bumblebee PPA? (Moreover, they fix some bugs.)
Tonight I will try to compile VirtualGL 2.2.90 manually. And I have just seen that dcommander released version 2.2.90 a few hours ago, so it's no longer a pre-release; he wrote: "2.3 beta1 (just released) should fix the build issue you were having."
We were able to compile it, so we may provide this version.
About nVidia drivers, we will no longer provide them in the Bumblebee PPA, as they are available in the distro repos, and I'm explaining why just below.
In fact, if you want new drivers, you have three possibilities: 1) the x-swat updates repo will provide you the latest stable release; 2) the xorg-edgers testing repo will provide you the latest unstable release; 3) the bumblebee repo will provide you a strange version.
If someone uses the xorg-edgers repo for anything, then he must use the nvidia drivers from that repo, because they're the only ones that work with multiarch.
So we have a problem trying to provide drivers for everyone.
What we're going to do is add two (or four) new repos to the Bumblebee project for newer drivers: they will provide the latest stable/unstable drivers for single-arch/multiarch systems, for people who want advanced drivers.
dcommander did good work with VirtualGL, and so did you, if you provide access to the latest OpenGL versions!
And adding new repos with different drivers can help fix some bugs! Keep us informed!
Currently the packaged version is in ppa:bumblebee/stable, but it'll have to be rebuilt because the /usr/bin/glxinfo
file conflicts with the one provided by mesa. The next version, 2.2.90-1~ppanatty2, should fix it.
Currently I'm also trying to get OpenGL 4 to work on my Optimus notebook. I also installed the new version of VirtualGL and created the symlink to the nvidia library as hiairrassary suggested above. I managed to create a context with the tutorial available here: http://www.opengl.org/wiki/Tutorial:_OpenGL_3.0_Context_Creation_%28GLX%29
I thought I would be done now. Yet, when trying to compile an OpenGL 4 program (the only one I found is this: http://prideout.net/blog/p48/TriangleTess.zip), I get the error message "Can't create a framebuffer for OpenGL 4.0" and I don't know what to do next. I'm running bumblebee from the repository and using the nvidia drivers that bumblebee installs (that should be fine). Any suggestions on what to do next?
Your sample uses some external libraries. And do you use a recent version of GLEW? Not all versions support the OpenGL 3.x - 4.x core profile.
See this:
Hello!
I want to tell you about a new problem with GLX_ARB_create_context: when I debug my program, if the OpenGL context is created in the compatibility profile there is no problem; but in the core profile, after the first glXSwapBuffers() every OpenGL call throws an error (caught with glGetError).
I don't know if the problem is in the Nvidia driver or in VirtualGL.
Below, as an attachment, you have a basic program. Compile it with g++ main.cpp -o m -lX11 -lGL.
As posted, the following line is commented out, so the context is a core-profile one and you get the error:
GLX_CONTEXT_PROFILE_MASK_ARB , GLX_CONTEXT_COMPATIBILITY_PROFILE_BIT_ARB,
Uncomment it to request the compatibility profile, and the error goes away.
P.S. I don't know how to hide the code on GitHub...
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <unistd.h>
#include <X11/Xlib.h>
#include <X11/Xutil.h>
#include <GL/gl.h>
#include <GL/glx.h>
#include <iostream>

typedef GLXContext (*glXCreateContextAttribsARBProc)(Display*, GLXFBConfig, GLXContext, Bool, const int*);

int main (int argc, char ** argv)
{
    Display *display = XOpenDisplay(0);
    if (!display)
    {
        printf( "Failed to open X display\n" );
        exit(1);
    }

    static int visual_attribs[] =
    {
        GLX_X_RENDERABLE  , True,
        GLX_DRAWABLE_TYPE , GLX_WINDOW_BIT,
        GLX_RENDER_TYPE   , GLX_RGBA_BIT,
        GLX_X_VISUAL_TYPE , GLX_TRUE_COLOR,
        GLX_RED_SIZE      , 8,
        GLX_GREEN_SIZE    , 8,
        GLX_BLUE_SIZE     , 8,
        GLX_ALPHA_SIZE    , 8,
        GLX_DEPTH_SIZE    , 24,
        GLX_STENCIL_SIZE  , 8,
        GLX_DOUBLEBUFFER  , True,
        None
    };

    int glx_major, glx_minor;
    // FBConfigs were added in GLX version 1.3.
    if ( !glXQueryVersion( display, &glx_major, &glx_minor ) ||
         ( ( glx_major == 1 ) && ( glx_minor < 3 ) ) || ( glx_major < 1 ) )
    {
        printf( "Invalid GLX version\n" );
        exit(1);
    }

    printf( "Getting matching framebuffer configs\n" );
    int fbcount;
    GLXFBConfig *fbc = glXChooseFBConfig( display, DefaultScreen( display ),
                                          visual_attribs, &fbcount );
    if ( !fbc || fbcount < 1 )
    {
        printf( "Failed to retrieve a framebuffer config\n" );
        exit(1);
    }
    GLXFBConfig bestFbc = fbc[0];

    // Get a visual
    XVisualInfo *vi = glXGetVisualFromFBConfig( display, bestFbc );

    XSetWindowAttributes swa;
    Colormap cmap;
    swa.colormap = cmap = XCreateColormap( display, RootWindow( display, vi->screen ), vi->visual, AllocNone );
    swa.background_pixmap = None;
    swa.border_pixel = 0;
    swa.event_mask = StructureNotifyMask;

    printf( "Creating window\n" );
    Window win = XCreateWindow( display,
                                RootWindow( display, vi->screen ),
                                0, 0, 100, 100, 0, vi->depth,
                                InputOutput,
                                vi->visual,
                                CWBorderPixel|CWColormap|CWEventMask,
                                &swa );
    if (!win)
    {
        printf( "Failed to create window.\n" );
        exit(1);
    }

    // Done with the visual info data
    XFree( vi );

    XStoreName( display, win, "OpenGL Window" );
    XMapWindow( display, win );

    glXCreateContextAttribsARBProc glXCreateContextAttribsARB = 0;
    glXCreateContextAttribsARB = (glXCreateContextAttribsARBProc)
        glXGetProcAddressARB( (const GLubyte *) "glXCreateContextAttribsARB" );

    GLXContext ctx = 0;
    if (!glXCreateContextAttribsARB)
    {
        printf( "glXCreateContextAttribsARB() not found\n" );
        exit(1);
    }
    else
    {
        int context_attribs[] =
        {
            GLX_CONTEXT_MAJOR_VERSION_ARB, 3,
            GLX_CONTEXT_MINOR_VERSION_ARB, 3,
            //GLX_CONTEXT_PROFILE_MASK_ARB , GLX_CONTEXT_COMPATIBILITY_PROFILE_BIT_ARB,
            None
        };
        printf( "Creating context\n" );
        ctx = glXCreateContextAttribsARB( display, bestFbc, 0, True, context_attribs );
        XSync( display, False );
        if (!ctx)
        {
            printf( "Failed to create GL 3.3 context\n" );
            exit(1);
        }
    }

    if ( !glXIsDirect( display, ctx ) )
        printf( "Indirect GLX rendering context obtained\n" );
    else
        printf( "Direct GLX rendering context obtained\n" );

    printf( "Making context current\n" );
    glXMakeCurrent( display, win, ctx );

    const GLubyte* oglVersion = glGetString(GL_VERSION);
    std::cout << "OpenGL version : " << oglVersion << std::endl;
    // Clear any error flag left over from context setup, so the
    // per-frame checks below only see errors raised by the frames.
    GLenum lastError = glGetError();

    // Frame 1 :
    glClearColor( 0, 0.5, 1, 1 );
    glClear( GL_COLOR_BUFFER_BIT );
    if( glGetError() != GL_NO_ERROR )
        std::cout << "Error OpenGL frame 1" << std::endl;
    glXSwapBuffers( display, win );
    sleep( 1 );

    // Frame 2 :
    glClearColor( 1, 0.5, 0, 1 );
    if( glGetError() != GL_NO_ERROR )
        std::cout << "Error OpenGL frame 2" << std::endl;
    glClear( GL_COLOR_BUFFER_BIT );
    glXSwapBuffers( display, win );
    sleep( 1 );

    glXMakeCurrent( display, 0, 0 );
    glXDestroyContext( display, ctx );
    XDestroyWindow( display, win );
    XFreeColormap( display, cmap );
    XCloseDisplay( display );
}
Thanks!
Please try the new version here: https://github.com/Bumblebee-Project/Bumblebee.
By the way, you should run glxinfo to check whether the GLX extensions you're using are supported or not.
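For example (optirun and glxinfo as used elsewhere in this thread; the grep pattern is the extension under discussion):
optirun glxinfo | grep GLX_ARB_create_context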
I will try your example as soon as possible.
I'm not having any errors with your example...
Can you explain to me why there are two Bumblebee projects? ^^
Ok, I'll re-install Ubuntu entirely with the latest Bumblebee and VirtualGL!
There was a big change at the end of July; we re-started the project from scratch at the mentioned URL.
That was in order to have a stable version, rather than one that is as up-to-date as possible.
If you look at the git repo, you will see that we now have a very precise development process, which gives far better results.
The MrMEEE version will be removed soon.
To switch to the new repository, do the following:
sudo apt-get install ppa-purge
sudo ppa-purge ppa:mj-casalogic/bumblebee
wget https://raw.github.com/Bumblebee-Project/Bumblebee/master/cleanup
chmod +x cleanup
sudo ./cleanup
sudo add-apt-repository ppa:bumblebee/stable
sudo apt-get update
sudo apt-get install bumblebee
sudo usermod -a -G bumblebee YOURUSERNAME
(Replace YOURUSERNAME with your username ;)
Reboot.
Try running glxspheres for around 30 s, close the window, and note the result.
Then run it with optirun, and compare.
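In plain commands, that comparison is:
glxspheres
optirun glxspheres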
I upgraded to the Bumblebee-Project version as explained above, and the code posted earlier runs and shows a blue (then orange) window... until a weird error happens:
Getting matching framebuffer configs
Creating window
Creating context
Direct GLX rendering context obtained
Making context current
OpenGL version : 3.3.0 NVIDIA 270.41.06
Error OpenGL frame 2
P.S. By the way, glxspheres reports 40 MP/s and optirun glxspheres reports ~115 MP/s with an nVidia 540M.
@libicocco: you should use a newer driver... What distro are you running?
@ArchangeGabriel: Ubuntu 11.04, and I'm using version 270.41.06.
Strange... 275.19 should be available in the normal distro repo.
I suggest you add the latest stable driver to your system:
sudo add-apt-repository ppa:ubuntu-x-swat/x-updates
sudo apt-get update
sudo apt-get install nvidia-current
Following your instructions, I updated to driver 280.13. I still get the same error though:
Getting matching framebuffer configs
Creating window
Creating context
Direct GLX rendering context obtained
Making context current
OpenGL version : 3.3.0 NVIDIA 280.13
Error OpenGL frame 2
And the performance of glxspheres is rather variable: it started at a stable ~170 MP/s for half a minute or so, then went down to ~80 MP/s, and now it's around ~110 MP/s. In the meantime only google-chrome was loading (not run with optirun, and stopped after the performance went down, which didn't improve the results).
Ok, dunno what to say about the error; maybe hiairrassary will be able to tell us more about it.
About glxspheres: it is a little benchmarking tool that needs improvements, and we need to change some things in Bumblebee to let you run it as a real benchmarking tool (VirtualGL should theoretically limit performance in order to avoid useless rendering).
When this repo didn't use to be a joke.
Hi,
I just updated bumblebee to the PPA version. glxgears and glxinfo run after stopping /etc/init.d/bumblebee. However, when I try to compile (optirun make) an application which depends on OpenGL 4, I get the following assertion failure:
Even when I try cmake (optirun cmake) I get the following:
The same commands ran properly with the git bumblebee from the beginning of June. Any ideas?