milasudril closed this issue 6 years ago
I'm guessing this is for use in software other than Blender?
On Aug 23, 2017 1:43 PM, "Torbjörn Rathsman" notifications@github.com wrote:
I want to implement something like "Filmic Blender" in GLSL, so I have some questions about the details of how the transformation works. I have found that there are two sets of LUTs: an RGB-to-RGB mapping, and a 1D mapping for controlling contrast:
- How are input values (those that come from the lighting algorithm) mapped to the range [0, 64]?
- When is the contrast control applied?
- How does the exposure control work?
Note that reusing the same LUTs may not be necessary, so some references with formulas would also be interesting.
I'm guessing so.
The problem with screen-space shaders is that their quality depends on the raw buffer they're transforming. For example, 3D rendered in immediate mode almost always assumes a gamma of 2.2, whereas with deferred shading the gamma transform is applied as the last step.
I assume Filmic Blender uses the very basic "1 photon + 1 photon = 2 photons" linear model, whereas most realtime 3D already bakes in the logarithmic assumption that "100 photons + 1 photon is only a 1% increase". MinutePhysics explains it best: https://www.youtube.com/watch?v=LKnqECcg6Gw (replace Photoshop with realtime 3D and blur with post-processing in general).
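To make the linear-vs-gamma point concrete, here is a small sketch (my own illustration, not from this project) showing that averaging two pixel values in gamma-encoded space gives a different, darker result than averaging in linear light. The pure 2.2 power law is a simplification of the real sRGB encoding.

```python
def encode_gamma(linear, gamma=2.2):
    """Encode linear light with a simple power law (a simplification of sRGB)."""
    return linear ** (1.0 / gamma)

black, white = 0.0, 1.0

# Naive: average the already-encoded values (what gamma-unaware code does).
naive = (encode_gamma(black) + encode_gamma(white)) / 2.0

# Correct: average in linear light, then encode for display.
correct = encode_gamma((black + white) / 2.0)

print(naive, correct)  # 0.5 vs ~0.73: the naive blend is visibly too dark
```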
You should be able to find lots of external resources about HDR and tonemapping in GLSL. For example, https://learnopengl.com/?_escaped_fragment_=Advanced-Lighting/HDR#!Advanced-Lighting/HDR ~Anything relating to this project specifically you could glean from the source code, no?~ (My mistake, this project doesn't provide any source code.)
@echuber2 I have already read that, but their exp mapping is 1D, which still fails regardless. I tested it, and yes, you get the ugly hue shift when increasing the light source intensity.
The shader is for a game engine I am working on in my spare time. If you want to know, it will be released under GPL3. The raw output from the shader is linear. This means that any sRGB textures are converted to linear half format.
You need 1D LUTs to shape the intensity channels properly. Either that, or write the curve in code and have your game engine evaluate it on the fly; LUTs will be faster.
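A minimal sketch of what such a 1D LUT lookup with linear interpolation looks like; the LUT contents here are a made-up identity ramp, not the project's actual contrast data:

```python
def sample_lut_1d(lut, x):
    """Sample a 1D LUT at normalized coordinate x in [0, 1] with linear interpolation."""
    x = min(max(x, 0.0), 1.0)
    pos = x * (len(lut) - 1)
    i = int(pos)
    j = min(i + 1, len(lut) - 1)
    t = pos - i
    return lut[i] * (1.0 - t) + lut[j] * t

# Identity ramp: output should equal input (up to float error).
identity_lut = [i / 7.0 for i in range(8)]
print(sample_lut_1d(identity_lut, 0.35))
```

In GLSL the same thing falls out of a `texture()` fetch on a 1D texture with linear filtering.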
The general idea is pretty simple:
If you dump the 1D LUTs into a spreadsheet, you can look at the curves. The Log2 ranges are defined by the AllocationTransform.
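The allocation step can be sketched as follows. This mirrors an OCIO-style lg2 AllocationTransform: scene-linear values are mapped through log2 into [0, 1] before indexing the LUT. The min/max stops below are hypothetical placeholders; the real range comes from the config's AllocationTransform vars.

```python
import math

LOG2_MIN, LOG2_MAX = -10.0, 6.5  # hypothetical example range, in stops

def allocate_lg2(linear, lo=LOG2_MIN, hi=LOG2_MAX):
    """Map a scene-linear value to a [0, 1] LUT coordinate via log2."""
    linear = max(linear, 2.0 ** lo)  # clamp to avoid log2(0)
    return (math.log2(linear) - lo) / (hi - lo)

# Middle grey-ish scene values land somewhere in the middle of the range;
# the upper stop maps exactly to 1.0.
print(allocate_lg2(1.0), allocate_lg2(2.0 ** 6.5))
```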
Also bear in mind that you would need to account for the primaries of the reference space and output device. If one assumes REC.709 primaries for the reference and output device, this is a no-op. If any combination is different however, you'd need to properly transform for primaries. This includes the more complex case of HDR displays for example.
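Transforming between primaries is a 3x3 matrix multiply in linear light. As a sketch, here is the well-known Rec.709/sRGB-to-XYZ (D65) matrix; a different output gamut would use its own matrix, and Rec.709-in/Rec.709-out is the identity (the no-op case mentioned above):

```python
REC709_TO_XYZ = [
    [0.4124, 0.3576, 0.1805],
    [0.2126, 0.7152, 0.0722],
    [0.0193, 0.1192, 0.9505],
]

def mat3_mul_vec3(m, v):
    """Multiply a 3x3 matrix by a 3-vector."""
    return [sum(m[r][c] * v[c] for c in range(3)) for r in range(3)]

# Reference white [1, 1, 1] in Rec.709 maps to the D65 white point in XYZ.
white_xyz = mat3_mul_vec3(REC709_TO_XYZ, [1.0, 1.0, 1.0])
print(white_xyz)  # ~[0.9505, 1.0000, 1.0890]
```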
So, the order of operations is
frag_color = to_srgb(contrast_curve(desaturate(log2(hdr_input))));
There is no sRGB anywhere.
The contrast curve is designed for a power 2.2 device.
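The distinction matters because a pure 2.2 power law and the piecewise sRGB encoding are close but not identical; a curve mastered for one is slightly off on the other, most visibly in the shadows. A sketch of both encodes, using the standard sRGB constants:

```python
def encode_srgb(x):
    """Piecewise sRGB encode (IEC 61966-2-1): linear segment near black, then a 2.4 power."""
    return 12.92 * x if x <= 0.0031308 else 1.055 * x ** (1.0 / 2.4) - 0.055

def encode_power22(x):
    """Pure power 2.2 encode, as assumed by the contrast curve above."""
    return x ** (1.0 / 2.2)

# At middle grey the two agree to within a fraction of a percent,
# but they diverge near black where sRGB is linear.
print(encode_srgb(0.18), encode_power22(0.18))
print(encode_srgb(0.001), encode_power22(0.001))
```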
But otherwise, do I understand it correctly?
This particular pipe implements a log, then desaturation, then contrast. Applied and mastered for a 2.2 power function display when an sRGB display device is selected.
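Putting the described order together as one sketch: log2 shaping, then desaturation, then the contrast curve, with the result already mastered for a 2.2 power display (so no extra sRGB encode at the end). The allocation range, desaturation amount, and contrast curve below are all placeholders; the real data lives in the config's LUTs.

```python
import math

def desaturate(rgb, amount):
    """Mix each channel toward luma, using Rec.709 luma weights."""
    luma = 0.2126 * rgb[0] + 0.7152 * rgb[1] + 0.0722 * rgb[2]
    return [c + (luma - c) * amount for c in rgb]

def contrast(x):
    """Placeholder S-curve (smoothstep) standing in for the real 1D contrast LUT."""
    return x * x * (3.0 - 2.0 * x)

def filmic_like(rgb, lo=-10.0, hi=6.5):
    # 1. log2 allocation of scene-linear values into [0, 1]
    logged = [(math.log2(max(c, 2.0 ** lo)) - lo) / (hi - lo) for c in rgb]
    # 2. desaturation (mild here, purely illustrative)
    desat = desaturate(logged, amount=0.2)
    # 3. contrast curve
    contrasted = [contrast(min(max(c, 0.0), 1.0)) for c in desat]
    # 4. no sRGB encode: the output is mastered for a 2.2 power display
    return contrasted

print(filmic_like([0.18, 0.18, 0.18]))
```

Note that because every step is either per-channel or a mix toward luma, neutral greys stay neutral, which is the whole point of avoiding the naive per-channel exponential mapping.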
Good explanation. This is exactly what I needed.