Closed: batstick closed this issue 6 years ago
I can reproduce it here. Strange, we are using the same output from LuxCore (RGB(A)_IMAGEPIPELINE) for both. Maybe OpenGL is clamping the values into the 0..1 range when we assign them to the texture for drawing in the viewport? Or the Blender "display space shader" that applies the color space does the clamping? Or it's something else. This is the relevant file: https://github.com/LuxCoreRender/BlendLuxCore/blob/master/draw/__init__.py
It is indeed OpenGL that clamps the values into the 0..1 range when the texture's internal format is GL_RGBA, since that is a normalized fixed-point format: https://www.khronos.org/registry/OpenGL-Refpages/gl4/html/glTexImage2D.xhtml
Fixed it: we need to use internal_format = GL_RGBA32F so the texture stores unclamped floating-point values.
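For reference, here is a minimal sketch of what such a texture upload could look like using Blender's bgl module. This is not the actual BlendLuxCore code from draw/__init__.py; the function name and parameters (`texture_id`, `width`, `height`, `pixels`) are placeholders for illustration only.

```python
import bgl

def upload_imagepipeline_texture(texture_id, width, height, pixels):
    """Upload LuxCore's RGBA float imagepipeline output to an OpenGL texture."""
    # If this bgl build does not expose GL_RGBA32F, the raw GL enum value
    # 0x8814 (GL_RGBA32F) can be used instead.
    GL_RGBA32F = getattr(bgl, "GL_RGBA32F", 0x8814)

    # bgl expects a bgl.Buffer, not a plain Python list
    pixel_buffer = bgl.Buffer(bgl.GL_FLOAT, [len(pixels)], pixels)

    bgl.glBindTexture(bgl.GL_TEXTURE_2D, texture_id)
    bgl.glTexImage2D(
        bgl.GL_TEXTURE_2D,
        0,              # mipmap level
        GL_RGBA32F,     # internal format: 32-bit float, values are not clamped to 0..1
        width, height,
        0,              # border (must be 0)
        bgl.GL_RGBA,    # format of the source data
        bgl.GL_FLOAT,   # type of the source data
        pixel_buffer,
    )
```

With an unsized GL_RGBA internal format, the driver stores the data as normalized fixed point and clamps it to 0..1, which is why the viewport lost the HDR range; GL_RGBA32F keeps the full float values.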
System Information
Win 10 Pro, Radeon Pro Duo x2, GTX 980 for Display
Software Version
Short description of error
Adjusting the exposure/gamma in the viewport does not work properly. It seems that the full brightness range is not being used. I have attached a collage of examples: the top is a final render, the bottom is a viewport render of the same scene. Notice how the light bulb itself is so much brighter that the exposure had to be lowered to -8 for it to get dimmer at all. In the viewport, everything starts getting uniformly dimmer the moment the exposure is lowered.
Exact steps for others to reproduce the error
All you have to do is adjust the exposure with a final render and the viewport open at the same time to see the difference. I have attached a sample blend file: room01.zip
The issue occurs only with RT Path CPU + RT Path (Path sampler) inside the viewport. All other combinations work properly for the final render.