Open · tobim-zz opened this issue 14 years ago
Simple Patch:
static VALUE il_ClearColour(VALUE obj, VALUE rb_red, VALUE rb_green, VALUE rb_blue, VALUE rb_alpha)
{
    /* Take 0-255 byte values from the Ruby side... */
    ILclampf red   = (ILubyte)NUM2INT(rb_red);
    ILclampf green = (ILubyte)NUM2INT(rb_green);
    ILclampf blue  = (ILubyte)NUM2INT(rb_blue);
    ILclampf alpha = (ILubyte)NUM2INT(rb_alpha);

    /* ...and hand ilClearColour the 0.0-1.0 factors it expects in 1.7.8. */
    ilClearColour(red / 255.0, green / 255.0, blue / 255.0, alpha / 255.0);
    return Qnil;
}
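For context, a wrapper like this would typically be registered with the Ruby runtime along the following lines. This is only a sketch: the module name ("Devil"), the Ruby-visible method name ("set_clear_colour"), and the init function name are placeholders, not necessarily what rdevil's ruby_il.c actually uses.

#include <ruby.h>
#include <IL/il.h>

/* Hypothetical registration sketch -- module and method names are assumed. */
void Init_clear_colour_example(void)
{
    VALUE mDevil = rb_define_module("Devil");
    /* il_ClearColour takes 4 Ruby arguments plus the receiver. */
    rb_define_module_function(mDevil, "set_clear_colour", il_ClearColour, 4);
}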
Hi, thanks. I think this is an issue with DevIL and its incredibly unstable API. It was, in a previous version, designed to accept bytes, but it has moved to floats now?
There is also the issue that the official Windows binary seems to be a different version from the Linux binary. It's all very confusing and hard to maintain.
I guess I could add some conditional compilation (#ifdef) to the extension so it builds the correct wrapper for whichever version it's compiled against, but then the rdocs would be out of whack.
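A minimal sketch of what that conditional compilation might look like, assuming DevIL's il.h exposes an integer IL_VERSION macro (e.g. 178 for 1.7.8) and that older releases really did take byte values; neither assumption is verified here.

static VALUE il_ClearColour(VALUE obj, VALUE rb_red, VALUE rb_green, VALUE rb_blue, VALUE rb_alpha)
{
#if defined(IL_VERSION) && IL_VERSION >= 178
    /* 1.7.8 and later: ilClearColour takes ILclampf factors in 0.0-1.0,
       so scale the 0-255 values coming from Ruby. */
    ilClearColour(NUM2INT(rb_red)   / 255.0f,
                  NUM2INT(rb_green) / 255.0f,
                  NUM2INT(rb_blue)  / 255.0f,
                  NUM2INT(rb_alpha) / 255.0f);
#else
    /* Assumed older behaviour: pass the byte values straight through. */
    ilClearColour((ILubyte)NUM2INT(rb_red),
                  (ILubyte)NUM2INT(rb_green),
                  (ILubyte)NUM2INT(rb_blue),
                  (ILubyte)NUM2INT(rb_alpha));
#endif
    return Qnil;
}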
I'm currently finishing off some other projects before I come back, take a good look at the state of Devil (which I haven't done in a long time), and sort out some of its issues.
What platform are you running Devil on?
I'm using the libdevil1c2 apt package, i.e. DevIL 1.7.8-6, under current Ubuntu and Debian releases.
In 1.7.8 the underlying function ilClearColour expects color values as floating-point factors (0.0-1.0), but the Ruby wrapper is handing it bytes. Setting the background color to some basic colors like black, white, or a solid red or green happens to work, because every color value above zero gets clamped to 1.0. Values in the 0x01-0xfe range are treated as if 0xff had been provided, so other colors come out wrong.
For example, #120034 produces #FF00FF, and the orange #e6762b sets the clear color to white #FFFFFF.
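To make the clamping concrete, here is a small standalone C sketch; the expected outcomes in the comments follow the behaviour described above rather than anything verified against a specific DevIL build.

#include <IL/il.h>

int main(void)
{
    ilInit();

    /* What the unpatched wrapper effectively does for #120034: the raw
       byte components 0x12, 0x00, 0x34 are passed straight to
       ilClearColour, every non-zero channel gets clamped to 1.0, and the
       clear color becomes #FF00FF. */
    ilClearColour(0x12, 0x00, 0x34, 0xFF);

    /* The intended call: scale each byte into the 0.0-1.0 range, which
       preserves the dark purple #120034. */
    ilClearColour(0x12 / 255.0f, 0x00 / 255.0f, 0x34 / 255.0f, 0xFF / 255.0f);

    ilShutDown();
    return 0;
}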
I was unable to download older DevIL versions, so I don't know whether this also applies to them. Edit: the same applies to DevIL 1.1.1.
DevIL 1.7.8 - src-IL/src/il_devil.c - Lines 251-268:
rdevil - /ext/devil/ruby_il.c - Lines 362-372: