larskanis / opengl

The official repository of the ruby-opengl wrapper
http://larskanis.github.com/opengl
MIT License

Several fixes #9

Closed archseer closed 11 years ago

archseer commented 11 years ago

We now properly convert return values to GLboolean, which fixes the glGetBooleanv calls that sometimes returned integers. Several other fixes included.

archseer commented 11 years ago

Actually, I don't believe this is right. I can't find any documentation on interpreting non-boolean (and non-zero) values as true; glGetBooleanv should ALWAYS return GL_TRUE/GL_FALSE.

The only thing the doc mentions is how it gets converted internally: "If glGetBooleanv is called, a floating-point (or integer) value is converted to GL_FALSE if and only if it is 0.0 (or 0). Otherwise, it is converted to GL_TRUE." So we should get a bool returned in any case. I suggest we build a C example for the failed tests and see what gets returned there.
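To make the quoted rule concrete, here is a minimal Ruby sketch of the conversion the spec describes (this is just a restatement of the documented behavior, not code from the wrapper or the driver):

```ruby
# The spec's rule: a numeric value queried through glGetBooleanv becomes
# GL_FALSE if and only if it is exactly 0 (or 0.0); any other value
# becomes GL_TRUE.
def gl_numeric_to_boolean(value)
  value.zero? ? false : true
end
```

So whatever numeric state the driver holds internally, the value handed back by glGetBooleanv should already be collapsed to one of the two boolean constants by the time we see it.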

archseer commented 11 years ago

Hmm, this is really weird. I wrote a C program that prints the result of glGetBooleanv for GL_DEPTH_WRITEMASK, and I made a specially crafted C function for Ruby that does the same. What's surprising is that this works in C (I get back true/false), but inside the Ruby C function we actually get back a number?!

static VALUE
gl_getTestr(VALUE obj) {
  GLboolean val;
  glGetBooleanv(2930, &val); /* 2930 == GL_DEPTH_WRITEMASK (0x0B72) */

  if (val == GL_TRUE) {
    printf("It was true\n");
  } else if (val == GL_FALSE) {
    printf("It was false\n");
  } else {
    printf("It was %i\n", val);
  }
  return GLBOOL2RUBY(val);
...
rb_define_module_function(module, "glTestr", gl_getTestr, 0);
}
p glTestr

and

#include <GL/gl.h>
#include <GL/glu.h>
#include <GL/glut.h>
#include <stdlib.h>
#include <stdio.h>

void init(void)
{
  GLboolean val;

  glDepthMask(GL_TRUE);
  glGetBooleanv(2930, &val); /* 2930 == GL_DEPTH_WRITEMASK (0x0B72) */

  if (val == GL_TRUE) {
    printf("It was true\n");
  } else if (val == GL_FALSE) {
    printf("It was false\n");
  } else {
    printf("It was %i\n", val);
  }
}

int main(int argc, char** argv)
{
   glutInit(&argc, argv);
   glutInitDisplayMode(GLUT_SINGLE | GLUT_RGB | GLUT_DEPTH);
   glutInitWindowSize (500, 500);
   glutInitWindowPosition (100, 100);
   glutCreateWindow(argv[0]);
   init();
   return 0;
}

COMPLETELY different results from exactly the same function calls... I suspect we may be setting some header defines in common.h, or something similar, that changes the behavior this radically...

archseer commented 11 years ago

The problem here goes deeper than that. When calling glDepthMask(GL_TRUE);, Ruby actually converts GL_TRUE incorrectly: since the constant is defined as true (a TrueClass instance), it gets converted to 41, so what ends up being called is essentially glDepthMask(41);
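The fix amounts to mapping Ruby's truthiness explicitly onto the numeric GLboolean constants before handing the value to GL, instead of letting a VALUE leak through as an integer. A minimal sketch of that mapping (the helper name and standalone constants here are illustrative, not the wrapper's actual code):

```ruby
GL_FALSE = 0x0
GL_TRUE  = 0x1

# In Ruby only false and nil are falsy; anything else (including the
# TrueClass instance behind the wrapper's GL_TRUE) must become 1, never
# whatever integer the underlying VALUE happens to encode as.
def rubybool_to_glboolean(value)
  value ? GL_TRUE : GL_FALSE
end
```

With this in place, glDepthMask(GL_TRUE) passes 1 to the C side regardless of how Ruby represents true internally.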

archseer commented 11 years ago

There, now that part is solved. I also added a fix for the to_a conversion that broke some examples that passed a scalar instead of an array.
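For the to_a issue, the usual pattern is to normalize the argument with Kernel#Array, which wraps a scalar but leaves an array untouched; blindly calling to_a on the value is what breaks scalar callers. A sketch of that pattern (not the exact code in this PR):

```ruby
def normalize_to_array(value)
  # Kernel#Array leaves an Array untouched and wraps a scalar in a
  # one-element array. Note that it turns nil into [].
  Array(value)
end
```

This way a function that expects an array of parameters still accepts a bare scalar, as several of the examples do.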

archseer commented 11 years ago

Since you've been inactive, this has gotten a bit out of hand; there are loads more changes...