Can you call this function and see if you actually got an sRGB context on linux? https://docs.rs/glutin/0.22.0-alpha1/glutin/struct.ContextWrapper.html#method.get_pixel_format
Can you call this function to check if you got an EGL context on linux? https://docs.rs/glutin/0.22.0-alpha1/glutin/struct.Context.html#method.get_egl_display
Currently sRGB support should work with GLX, but not EGL. No clue about WGL. On MacOS I have a burning suspicion it is broken (https://github.com/rust-windowing/glutin/issues/1160).
Executing the following immediately after creating the display:

```rust
use crate::glium::glutin::os::ContextTraitExt;
println!("{:#?}", display.gl_window().get_pixel_format());
println!("{:?}", unsafe { display.gl_window().get_egl_display() });
```

... with with_srgb(false) results in:
PixelFormat {
hardware_accelerated: true,
color_bits: 24,
alpha_bits: 8,
depth_bits: 24,
stencil_bits: 8,
stereoscopy: false,
double_buffer: true,
multisampling: None,
srgb: false,
}
None
With with_srgb(true), the field srgb is true. So judging from this, it seems to work! However, that still doesn't explain why I get the wrong color (in other words: why the color conversion is performed).
EDIT: I guess it would be useful to see the output of glGetFramebufferAttachmentParameteriv with GL_FRAMEBUFFER_ATTACHMENT_COLOR_ENCODING. I will try to get that information later!
Right, hmmm. On linux, can you check if it thinks it's srgb? https://community.khronos.org/t/check-color-encoding-in-the-default-framebuffer-draw-buffer-for-srgb/72854
Also, can you provide some hardware info:
Linux: Just run glxinfo && glinfo && lspci
Thanks for answering so fast! :)
> Right, hmmm. On linux, can you check if it thinks it's srgb? https://community.khronos.org/t/check-color-encoding-in-the-default-framebuffer-draw-buffer-for-srgb/72854
I have never executed raw OpenGL calls when using glium. Could you give me a quick pointer on how I would do that? I can also figure it out myself tomorrow, that would not be a problem, but maybe you can speed this up by giving me some pointers ^_^
> Also, can you provide some hardware info: Linux: Just run glxinfo && glinfo && lspci

Did you mean eglinfo?
I've never used glium, sorry.
So ok, I ran this code using glutin and the gl crate.
let events_loop = glutin::EventsLoop::new();
let wb = glutin::WindowBuilder::new();
let gl_window = glutin::ContextBuilder::new()
.with_srgb(false) // <------------------------------------------
.build_windowed(wb, &events_loop)
.unwrap();
let gl_window = unsafe { gl_window.make_current().unwrap() };
gl::load_with(|symbol| gl_window.get_proc_address(symbol) as *const _);
let mut out: gl::types::GLint = 0;
unsafe {
gl::GetFramebufferAttachmentParameteriv(
gl::DRAW_FRAMEBUFFER,
gl::BACK,
gl::FRAMEBUFFER_ATTACHMENT_COLOR_ENCODING,
&mut out as *mut _,
);
}
println!("out: {}", out);
if out == gl::LINEAR as i32 {
println!(" linear");
} else if out == gl::SRGB as i32 {
println!(" srgb");
}
It always prints, regardless of the value in with_srgb(_):
out: 35904
srgb
I only checked this on Ubuntu now, but I might check it on Windows later. In any case, this seems like a bug somewhere, right? The framebuffer is sRGB but it shouldn't be. glutin incorrectly thinks the pixel format is not sRGB.

Any idea what could be wrong here?
EDIT: I tested it on Windows and it gets stranger. Regardless of the with_srgb parameter, I get the following output:
out: 9729
linear
PixelFormat {
hardware_accelerated: true,
color_bits: 24,
alpha_bits: 8,
depth_bits: 24,
stencil_bits: 8,
stereoscopy: false,
double_buffer: true,
multisampling: None,
srgb: true,
}
The shown color is #bdbdbd.
I uploaded my test program now: https://github.com/LukasKalbertodt/srgb-test
So on Windows (with WGL) what might be happening is that the function choose_arb_pixel_format returns true for sRGB (since WGL_EXT_framebuffer_sRGB is available on my machine; maybe it's the same on your machine, or maybe you have WGL_ARB_framebuffer_sRGB), and then returns that value in the PixelFormat struct. Finally, this struct is stored in the pixel_format field of the Context struct.

From there, any call to get_pixel_format() simply clones this field and returns it, and that's how you end up seeing weird behavior where OpenGL claims to be seeing linear but glutin thinks sRGB is being used.
@ZeGentzy Should choose_arb_pixel_format take into consideration the pixel format that the user has specified?
@sumit0190 choose_arb_pixel_format will only say that it is sRGB if the call to GetPixelFormatAttribivARB says it is. At this point, the format has already been chosen earlier and we are just returning its properties.

If the claim is that the format is not actually sRGB but GetPixelFormatAttribivARB is saying it is, then this is a winapi bug, and should be filed with Microsoft (good luck with that), although I highly doubt this is the case.
On Windows, the format is decided by calling GetPixelFormat to get the format of the window. If a format is found after calling that function, we use that format, as we cannot change it; see the remarks for SetPixelFormat:
> [...] Setting the pixel format of a window more than once can lead to significant complications for the Window Manager and for multithread applications, so it is not allowed. An application can only set the pixel format of a window one time. Once a window's pixel format is set, it cannot be changed.
>
> -- https://docs.microsoft.com/en-us/windows/desktop/api/wingdi/nf-wingdi-setpixelformat
If not found (as in previously unset), we either call choose_arb_pixel_format_id or choose_native_pixel_format_id (depending on the presence of WGL_ARB_pixel_format) and set the format of the window to that.

choose_native_pixel_format_id simply errors, implying this is not the function in use by your computer: https://github.com/rust-windowing/glutin/blob/master/glutin/src/api/wgl/mod.rs#L495

choose_arb_pixel_format_id only requests an sRGB context if the user requested it: https://github.com/rust-windowing/glutin/blob/master/glutin/src/api/wgl/mod.rs#L704
Looking over the code, I see one potential issue which should be investigated by someone in possession of a Windows computer:

It might be possible that the lack of a FRAMEBUFFER_SRGB_CAPABLE_{EXT,ARB} here is treated as a "don't care". This is unlikely, as that would break all existing OpenGL programs. A simple test would be to add FRAMEBUFFER_SRGB_CAPABLE_{EXT,ARB} followed by 0 if sRGB is not requested.
> So ok, I ran this code using glutin and the gl crate. [...]
Numerous things:

1- Afaik, GL_BACK is not a valid attachment. If you check for an error after that call, I suspect you'll be receiving GL_INVALID_ENUM. Then again, since stereoscopy is not in use, maybe it is treating it as an alias for GL_BACK_LEFT.

2- You are not enabling GL_FRAMEBUFFER_SRGB when intending to use sRGB. From the ARB_framebuffer_sRGB extension:

> 18) Why is the sRGB framebuffer GL_FRAMEBUFFER_SRGB enable disabled by default?
>
> RESOLVED: This extension could have a boolean sRGB-versus-non-sRGB pixel format configuration mode that determined whether or not sRGB framebuffer update and blending occurs. The problem with this approach is 1) it creates may more pixel formation configurations because sRGB and non-sRGB versions of lots of existing configurations must be advertised, and 2) applicaitons unaware of sRGB might unknowingly select an sRGB configuration and then generate over-bright rendering. It seems more appropriate to have a capability for sRGB framebuffer update and blending that is disabled by default. This allows existing RGB8 and RGBA8 framebuffer configurations to be marked as sRGB capable (so no additional configurations need be enumerated). Applications that desire sRGB rendering should identify an sRGB-capable framebuffer configuration and __then enable sRGB rendering.__
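For reference, that last step amounts to a single call with the gl crate (a sketch; it goes after the context is made current and the function pointers are loaded):

```rust
// Opt in to the linear -> sRGB conversion on writes to sRGB-capable framebuffers.
// Without this enable, even an sRGB-capable default framebuffer is written to as-is.
unsafe {
    gl::Enable(gl::FRAMEBUFFER_SRGB);
}
```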
> 1- Afaik, GL_BACK is not a valid attachment. If you check for an error after that call, I suspect you'll be receiving GL_INVALID_ENUM. Then again, since stereoscopy is not in use, maybe it is treating it as an alias for GL_BACK_LEFT.

Good point. I changed it but it didn't change anything on Ubuntu or Windows. I pushed the change to the repo I linked above. I also checked glGetError with just GL_BACK and it returned 0, so apparently it was fine.
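(For completeness, the error check I mean is just this, placed right after the query; a small sketch with the gl crate:)

```rust
// gl::NO_ERROR is 0, so a 0 here means the driver accepted GL_BACK for the query.
let err = unsafe { gl::GetError() };
println!("glGetError after the query: {}", err);
```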
> 2- You are not enabling GL_FRAMEBUFFER_SRGB when intending to use sRGB. From the ARB_framebuffer_sRGB extension:

I'm not quite sure what you want to say by that, sorry! This tiny program I posted is just to test what various sources report about the framebuffer.
And could someone clarify in what state this issue is? Like:

1. Do you agree that what I'm reporting is a bug (somewhere)? As in: the example programs I'm posting with `with_srgb(false)` and a standard shader that returns `0.5` should result in an image that is `#808080` and not `#bbbbbb`? This behavior is unintended, right? It's not my fault, right?
2. Does this happen on your machines, too? Or is that just something that happens to me?
3. What are the next steps? I have no clue about the `glutin` codebase, but can help somehow?

There's definitely something funky going on here, although my cursory knowledge of OpenGL/glutin may not be enough to figure it out and @ZeGentzy might have to help me out in the end.
Interestingly, at least on my machine, changing with_srgb to true doesn't change the pixel_format_id at all; plus, GetFramebufferAttachmentParameteriv still returns linear (I made sure to enable GL_FRAMEBUFFER_SRGB before trying this out).
@sumit0190 Can you check that 1 is being pushed by one of these: https://github.com/rust-windowing/glutin/blob/master/glutin/src/api/wgl/mod.rs#L704 ?
@ZeGentzy So here's some more interesting stuff.

with_srgb set to false returns this descriptor:

[8193, 1, 8208, 1, 8211, 8235, 8195, 8231, 8212, 24, 8219, 8, 8226, 24, 8227, 8, 8209, 1, 8210, 0, 0]

with_srgb set to true returns this descriptor:

[8193, 1, 8208, 1, 8211, 8235, 8195, 8231, 8212, 24, 8219, 8, 8226, 24, 8227, 8, 8209, 1, 8210, 0, 8361, 1, 0]

So the with_srgb parameter is doing its job and adding the right attribute (8361). But the pixel_format_id returned in both cases is 10, which is weird, right?
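As a quick sanity check on that attribute id (my own decoding, worth double-checking against the WGL headers): 8361 in hex is 0x20A9, which is the value of both WGL_FRAMEBUFFER_SRGB_CAPABLE_ARB and WGL_FRAMEBUFFER_SRGB_CAPABLE_EXT, so the sRGB attribute really is being appended.

```rust
fn main() {
    // 8361 decimal == 0x20A9 == WGL_FRAMEBUFFER_SRGB_CAPABLE_{ARB,EXT}
    println!("{:#06x}", 8361); // prints "0x20a9"
}
```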
Then I decide to use my own dummy descriptor, with some random values for all attributes. This is what that looks like:
let descriptor =
[
gl::wgl_extra::DRAW_TO_WINDOW_ARB as i32, 1 as i32,
gl::wgl_extra::SUPPORT_OPENGL_ARB as i32, 1 as i32,
gl::wgl_extra::DOUBLE_BUFFER_ARB as i32, 1 as i32,
gl::wgl_extra::PIXEL_TYPE_ARB as i32, gl::wgl_extra::TYPE_RGBA_ARB as i32,
gl::wgl_extra::COLOR_BITS_ARB as i32, 32 as i32,
gl::wgl_extra::DEPTH_BITS_ARB as i32, 24 as i32,
gl::wgl_extra::STENCIL_BITS_ARB as i32, 8 as i32,
// gl::wgl_extra::FRAMEBUFFER_SRGB_CAPABLE_EXT as i32, pf_reqs.srgb as i32,
0 as i32, // End
];
Still the same result though; a format of 9 is returned, and glutin still thinks srgb is set.

But if I uncomment the commented line, suddenly things change: now, a format of 9 is returned when with_srgb is set to true, but when set to false, a format of 105 is returned, and glutin behaves correctly.

Now that makes me think: it looks like FRAMEBUFFER_SRGB_CAPABLE_EXT should always be defined, and just as you recommended, it should be set to 0 or 1 depending on whether sRGB is requested.
So I make these modifications:
// Find out if sRGB is needed and explicitly set its value to 0 or 1.
if extensions
.split(' ')
.find(|&i| i == "WGL_ARB_framebuffer_sRGB")
.is_some()
{
out.push(
gl::wgl_extra::FRAMEBUFFER_SRGB_CAPABLE_ARB as raw::c_int,
);
} else if extensions
.split(' ')
.find(|&i| i == "WGL_EXT_framebuffer_sRGB")
.is_some()
{
out.push(
gl::wgl_extra::FRAMEBUFFER_SRGB_CAPABLE_EXT as raw::c_int,
);
} else {
return Err(());
}
out.push(pf_reqs.srgb as raw::c_int);
And this should do the same thing as my dummy descriptor. But even though I can see the attribute (8361) added with a 0 or 1 depending on my with_srgb, it gives me the same pixel_format_id regardless (10). Needs some more investigation, I guess.
Please submit a PR with this code instead:
// Find out if an sRGB extension is present then explicitly set its value to 0 or 1.
if extensions
.split(' ')
.find(|&i| i == "WGL_ARB_framebuffer_sRGB")
.is_some()
{
out.push(
gl::wgl_extra::FRAMEBUFFER_SRGB_CAPABLE_ARB as raw::c_int,
);
out.push(pf_reqs.srgb as raw::c_int);
} else if extensions
.split(' ')
.find(|&i| i == "WGL_EXT_framebuffer_sRGB")
.is_some()
{
out.push(
gl::wgl_extra::FRAMEBUFFER_SRGB_CAPABLE_EXT as raw::c_int,
);
out.push(pf_reqs.srgb as raw::c_int);
} else {
if pf_reqs.srgb {
return Err(());
}
}
as we should only error if srgb was requested and there was no srgb extension.
Nvm, didn't read hard enough, you said it still doesn't work in the end... hmmmm.
Can you comment out the code specifying things like alpha, stencil bits, etc., then add them back one by one. The goal is to discover if one of the other options is conflicting with/overriding what we set FRAMEBUFFER_SRGB_CAPABLE_* to.
> Nvm, didn't read hard enough, you said it still doesn't work in the end... hmmmm.
We'd still need to modify it anyway, since if it's left unspecified it's somehow assumed to be true (this is contrary to the documented behavior; I should try this on Linux sometime).
Anyway, I was doing exactly what you suggested and narrowed down the culprit to ALPHA_BITS_ARB.
So just to be clear: if ALPHA_BITS_ARB is specified in the descriptor, then FRAMEBUFFER_SRGB_CAPABLE_* is somehow not taken into consideration by ChoosePixelFormat (i.e., a pixel format that has sRGB enabled is returned). This is why my dummy descriptor works (if I always specify FRAMEBUFFER_SRGB_CAPABLE_*), but the same logic doesn't work in the existing method since it also needs to specify ALPHA_BITS_ARB.
Can you run this program and check the available pixel formats: http://realtech-vr.com/admin/glview
This just strikes me as shitty intel drivers.
Available pixel formats seem right to me. And I did my investigation on an Nvidia Quadro with recent-ish drivers (although theirs have been known to be shitty as well, so you never know).
Also, I just verified that setting ALPHA_BITS_ARB to 0 (instead of the default 8) seems to also give the right results (i.e., sRGB is false).
@ZeGentzy Here's an interesting link that chronicles similar oddities: https://devtalk.nvidia.com/default/topic/776591/opengl/gl_framebuffer_srgb-functions-incorrectly/
Note that while the thread itself is from 2014, the last post (by which time the issue hadn't been resolved yet) is from a year ago.
Interestingly, on my Linux VM, get_pixel_format returns the right value that corresponds to with_srgb, but OpenGL always returns sRGB. It might be worth trying all these experiments (especially the one on Windows) using FBOs.
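A rough, hypothetical sketch of what the FBO variant of the experiment could look like with the gl crate (sizes and the exact texture format are arbitrary, and this assumes a current context with loaded function pointers):

```rust
// Render into an explicitly sRGB texture instead of the default framebuffer,
// then query the attachment's color encoding, which should report GL_SRGB.
let (mut tex, mut fbo) = (0, 0);
unsafe {
    gl::GenTextures(1, &mut tex);
    gl::BindTexture(gl::TEXTURE_2D, tex);
    gl::TexImage2D(
        gl::TEXTURE_2D, 0, gl::SRGB8_ALPHA8 as i32, 256, 256, 0,
        gl::RGBA, gl::UNSIGNED_BYTE, std::ptr::null(),
    );

    gl::GenFramebuffers(1, &mut fbo);
    gl::BindFramebuffer(gl::DRAW_FRAMEBUFFER, fbo);
    gl::FramebufferTexture2D(
        gl::DRAW_FRAMEBUFFER, gl::COLOR_ATTACHMENT0, gl::TEXTURE_2D, tex, 0,
    );

    let mut encoding: gl::types::GLint = 0;
    gl::GetFramebufferAttachmentParameteriv(
        gl::DRAW_FRAMEBUFFER,
        gl::COLOR_ATTACHMENT0,
        gl::FRAMEBUFFER_ATTACHMENT_COLOR_ENCODING,
        &mut encoding,
    );
    println!("FBO attachment is sRGB: {}", encoding == gl::SRGB as i32);
}
```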
@sumit0190 Is there a format with non-zero alpha bits which is also not-srgb?
@ZeGentzy Not that I can find. glview doesn't list out alpha-bits, and visualinfo (from glew) doesn't list out SRGB_EXT. I was able to combine information from the two for my experiments though.
Anyway, from what I can observe, this behavior is visible outside of glutin too - a simple C++ program using wgl that uses wglChoosePixelFormat also displays the same behavior.
I created this table to better document this weird behavior. Note that while this experiment was done with the C++ version, glutin also agrees with it. In my test program, I try to render a window with glClearColor(0.5f, 0.5f, 0.5f, 1.0f). Sometimes the result would be #808080, which is expected, but often it would show up as #bbbbbb. I have also included certain properties of my pixel formats for easier reference.
Without further ado:
Default attributes:
WGL_DRAW_TO_WINDOW_ARB, TRUE,
WGL_ACCELERATION_ARB, WGL_FULL_ACCELERATION_ARB,
WGL_SUPPORT_OPENGL_ARB, TRUE,
WGL_DOUBLE_BUFFER_ARB, TRUE,
WGL_PIXEL_TYPE_ARB, WGL_TYPE_RGBA_ARB,
WGL_COLOR_BITS_ARB, 24,
WGL_DEPTH_BITS_ARB, 24,
WGL_STENCIL_BITS_ARB, 0,
Pixel formats:
Pixel Format | Alpha Bits | WGL_FRAMEBUFFER_SRGB_CAPABLE_EXT |
---|---|---|
7 | - | 1 |
8 | 8 | 1 |
103 | - | 0 |
Observations:
WGL_ALPHA_BITS_ARB listed? | WGL_FRAMEBUFFER_SRGB_CAPABLE_EXT listed? | glEnable(GL_FRAMEBUFFER_SRGB)? | Pixel format selected | Output color |
---|---|---|---|---|
No | No | No | 7 | #808080 |
No | No | Yes | 7 | #bbbbbb |
No | Yes (1) | No | 7 | #808080 |
No | Yes (0) | No | 103 | #808080 |
No | Yes (1) | Yes | 7 | #bbbbbb |
No | Yes (0) | Yes | 103 | #808080 |
Yes (8) | No | No | 8 | #808080 |
Yes (8) | No | Yes | 8 | #bbbbbb |
Yes (8) | Yes (1) | No | 8 | #808080 |
Yes (8) | Yes (0) | No | 8 | #808080 |
Yes (8) | Yes (1) | Yes | 8 | #bbbbbb |
Yes (8) | Yes (0) | Yes | 8 | #bbbbbb |
Summary:

So this pretty much confirms what we see with glutin. Granted, all of this may be because my PC has some stupid driver issue, but it's still odd.
1) If WGL_FRAMEBUFFER_SRGB_CAPABLE_EXT is not listed, then it's assumed to be true. This is why if you then use glEnable(GL_FRAMEBUFFER_SRGB), you will see sRGB output (which is only possible if WGL_FRAMEBUFFER_SRGB_CAPABLE_EXT is true and glEnable is used).
2) If WGL_FRAMEBUFFER_SRGB_CAPABLE_EXT is listed, then the behavior is the same as what you'd expect - the glEnable call will then combine with the value of WGL_FRAMEBUFFER_SRGB_CAPABLE_EXT to determine if sRGB needs to be shown.
3) If WGL_ALPHA_BITS_ARB is listed, then WGL_FRAMEBUFFER_SRGB_CAPABLE_EXT is assumed to be true, regardless of what the listed value says (we already know it's true if not listed). Again, this is why the sRGB output is solely controlled by the glEnable call.
@ZeGentzy Thoughts?
Silly driver issues are silly. Not much we can do. Please file this as a PR: https://github.com/rust-windowing/glutin/issues/1175#issuecomment-506984460
I'm opening separate issues for X11 and macOS, so that this issue doesn't get too cluttered.
Edit: anyways, I'll close this issue as wontfix (as there is nothing we can do) once you got that PR filed. Thanks for your time :)
Apologies, @LukasKalbertodt, you sorta got drowned out.
> > 2- You are not enabling GL_FRAMEBUFFER_SRGB when intending to use sRGB. From the ARB_framebuffer_sRGB extension:
>
> I'm not quite sure what you want to say by that, sorry! This tiny program I posted is just to test what various sources report about the framebuffer.

What I'm saying is, while that code is satisfactory for testing if the FBO is linear/srgb, actually drawing to it would cause silly stuff. I just wanted to point that out, altho it wasn't actually relevant to what it was demonstrating.
> 1. Do you agree that what I'm reporting is a bug (somewhere)? As in: the example programs I'm posting with `with_srgb(false)` and a standard shader that returns `0.5` should result in an image that is `#808080` and _not_ `#bbbbbb`? This behavior is unintended, right? It's not my fault, right?
Yeah something looks wrong, altho I'm not all too familiar with how sRGB works. Yeah, it's not your fault.
> 2. Does this happen on your machines, too? Or is that just something that happens to me?
Yeah, I've experienced sRGB issues. I usually fiddle with the supplied colors until I get my expected color.
> 3. What are the next steps? I have no clue about the `glutin` codebase, but can help somehow?
Most likely there is little we can do. If the drivers are broken, the drivers are broken. Feel free to scheme up workarounds like that in https://github.com/rust-windowing/glutin/issues/1175#issuecomment-506984460. If they aren't too intrusive, I'm fine with including them.
@ZeGentzy I'll have that PR ready by tomorrow. It was fun investigating this; I hope to be able to contribute more to the project!
@LukasKalbertodt With my PR, sRGB rendering will be controlled by a glEnable(GL_FRAMEBUFFER_SRGB) call and NOT by the with_srgb parameter. This is because with the current default alpha value, with_srgb will always be assumed to be true.
> @LukasKalbertodt With my PR, sRGB rendering will be controlled by a glEnable(GL_FRAMEBUFFER_SRGB) call and NOT by the with_srgb parameter. This is because with the current default alpha value, with_srgb will always be assumed to be true.
Well, with your specific drivers, yes. Other drivers might behave differently, so users should still set with_srgb appropriately.
@ZeGentzy Well that's the thing: I tested this with Nvidia, Intel and AMD and got the same results on all three when testing with Windows 10 (with the latest drivers). OP's Intel behaves the same way.
All this makes me think that it's at least partially due to WGL, but I could be wrong. So yeah, I don't think this has anything to do with the driver version.
@sumit0190 Older/newer implementations might behave differently, idk. Maybe I'm reading the extension wrong. Maybe none of the drivers bothered caring. Nevertheless, users should set with_srgb correctly.
Sorry for the super late reply!
> Apologies, @LukasKalbertodt, you sorta got drowned out.
No problem! Thank you two for investigating this so quickly!
I unfortunately do not understand everything you wrote, as I have no idea about the windowing side of OpenGL. As I understand all the comments: "drivers are bad & strange, set with_srgb() correctly but assume it won't be used on some machines". I will continue to tell glium that my shader outputs sRGB (which is technically a lie) so that GL_FRAMEBUFFER_SRGB is not enabled by glium, avoiding the color conversion.
So I think this all is pretty unfortunate, because AFAICT most glium programs on most machines are just "wrong" then. Meaning: colors are incorrectly converted. I mean, from the artistic point of view it doesn't matter because you just tweak your colors until you like them. But returning 0.5 from the pixel shader and not getting #808080 is really not great IMO. However, this is not meant as a critique, just as a statement of how unfortunate this is! Thanks again for helping out!
@LukasKalbertodt Is this still an issue? I noticed that when I render a shader from shadertoy using glium, it looks brighter. Is that because of this sRGB issue? What's the recommended fix?
On Shadertoy: [screenshot]
Using glium: [screenshot]
I just tested this again with more or less the code from my first comment, and I indeed still get exactly the same colors for the four scenarios I described. So yes, this issue still exists as far as I can tell. I just tested on Ubuntu 20.04 now, though.
@LukasKalbertodt In my situation, it worked as expected to add outputs_srgb: true for all shaders, but is there a solution that doesn't require this?

Btw, why does no conversion from sRGB to RGB happen in the case of "with_srgb(false) and outputs_srgb: true"?
@Boscop Yes, adding outputs_srgb: true to all shaders works. And after dealing with this whole color space issue for some time, I believe this is even the "correct" solution in many situations (just as an FYI, for anyone stumbling upon this issue).
Regarding your question, see my first comment, in particular:
> as above, the framebuffer should be linear RGB, meaning that no conversion should happen, regardless of GL_FRAMEBUFFER_SRGB.
@LukasKalbertodt Thanks, I read that, but why does that mean that no conversion happens (if the framebuffer is RGB but the shader outputs sRGB)?
To be honest, I already forgot most of the details. I can't tell you why this is, but that's apparently how OpenGL works. For example, also see this StackOverflow answer:

> Any writes to images that are not in the sRGB format should not be affected.
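So as I understand it (my own summary; see the spec or that answer for the authoritative wording), the conversion only happens when both conditions hold, which a tiny predicate makes explicit:

```rust
// A linear -> sRGB conversion on write happens only if BOTH the attachment
// has an sRGB encoding AND GL_FRAMEBUFFER_SRGB is enabled; with_srgb(false)
// should therefore make the enable a no-op for the default framebuffer.
fn converts_on_write(attachment_is_srgb: bool, framebuffer_srgb_enabled: bool) -> bool {
    attachment_is_srgb && framebuffer_srgb_enabled
}
```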
Fixed in #1435, I guess.
Hello there!

I have found a few other issues related to sRGB, but they simply do not help me and I am still incredibly confused. My problem is that ContextBuilder::with_srgb does not seem to have any effect. Check out this minimal example (using glium). There are two places of interest, which I will explain later:

.with_srgb(_) // MARK A
outputs_srgb: _, // MARK B
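A sketch of roughly what such a minimal example looks like (a hypothetical reconstruction for illustration, not the exact original code; it assumes the EventsLoop-based glium/glutin API that was current at the time):

```rust
#[macro_use]
extern crate glium;

use glium::{glutin, Surface};

fn main() {
    let events_loop = glutin::EventsLoop::new();
    let wb = glutin::WindowBuilder::new();
    let cb = glutin::ContextBuilder::new()
        .with_srgb(false); // MARK A
    let display = glium::Display::new(wb, cb, &events_loop).unwrap();

    // A big triangle covering the window; the fragment shader just outputs 0.5.
    #[derive(Copy, Clone)]
    struct Vertex { position: [f32; 2] }
    implement_vertex!(Vertex, position);

    let vertices = [
        Vertex { position: [-1.0, -1.0] },
        Vertex { position: [ 3.0, -1.0] },
        Vertex { position: [-1.0,  3.0] },
    ];
    let vbo = glium::VertexBuffer::new(&display, &vertices).unwrap();
    let indices = glium::index::NoIndices(glium::index::PrimitiveType::TrianglesList);

    let program = glium::Program::new(
        &display,
        glium::program::ProgramCreationInput::SourceCode {
            vertex_shader: "
                #version 140
                in vec2 position;
                void main() { gl_Position = vec4(position, 0.0, 1.0); }
            ",
            fragment_shader: "
                #version 140
                out vec4 color;
                void main() { color = vec4(vec3(0.5), 1.0); }
            ",
            tessellation_control_shader: None,
            tessellation_evaluation_shader: None,
            geometry_shader: None,
            transform_feedback_varyings: None,
            outputs_srgb: false, // MARK B
            uses_point_size: false,
        },
    ).unwrap();

    let mut target = display.draw();
    target.clear_color(0.0, 0.0, 0.0, 1.0);
    target
        .draw(&vbo, &indices, &program, &glium::uniforms::EmptyUniforms, &Default::default())
        .unwrap();
    target.finish().unwrap();

    // Keep the window open long enough to take the screenshot.
    loop {
        std::thread::sleep(std::time::Duration::from_millis(100));
    }
}
```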
My test setup is the following: I run this program and take a screenshot and then inspect what color value the pixels have. For the four possible configurations, I get the following values:

| | `.with_srgb(false)` | `.with_srgb(true)` |
|---|---|---|
| `outputs_srgb: false` | #bbbbbb | #bbbbbb |
| `outputs_srgb: true` | #808080 | #808080 |
So as far as I understand: there is the GL_FRAMEBUFFER_SRGB flag. If that flag is false, OpenGL does not perform any conversion from fragment shader to frame buffer. If it is enabled, however, OpenGL assumes that the shader output is linear RGB and will thus -- if the frame buffer has an sRGB format -- convert the shader output to sRGB. In glium, GL_FRAMEBUFFER_SRGB is controlled by the outputs_srgb parameter of the program. If the latter is false, GL_FRAMEBUFFER_SRGB is set to true, and the other way around.

Additionally, I would expect glutin to create a framebuffer with the format specified by .with_srgb(_). As such, I have the following expectations:

- `with_srgb(false)` and `outputs_srgb: false`: the framebuffer should be linear RGB, meaning that no conversion should happen, regardless of GL_FRAMEBUFFER_SRGB. As such, I would expect #808080, but I got #bbbbbb.
- `with_srgb(true)` and `outputs_srgb: false`: the framebuffer should be sRGB and since the shader does not output sRGB, I expect OpenGL to convert. As such, I expect #bbbbbb, which I actually got.
- `with_srgb(false)` and `outputs_srgb: true`: as above, the framebuffer should be linear RGB, meaning that no conversion should happen, regardless of GL_FRAMEBUFFER_SRGB. As such, I would expect #808080, which I luckily also got.
- `with_srgb(true)` and `outputs_srgb: true`: the framebuffer is sRGB, but the shader also outputs in that color space, so no conversion should happen. As such I expect #808080, which I got.

This issue is about the first situation. As far as I can tell, this is just wrong.
I tested the above on several platforms: Ubuntu 18.04, MacOS and Windows. I always got the same results (well, on MacOS and Windows the #bbbbbb was slightly off, but still way more than #808080).

(I also did a test with GLFW in case that's interesting. I used the simple example and changed line 63 to " gl_FragColor = vec4(vec3(0.5), 1.0);\n". Running that, I get a #808080 triangle.)

Am I doing something wrong? Am I completely misunderstanding color spaces? Is this a bug in glutin/glium/...? Would be super great if someone could help me!