sobotka / filmic-blender

Film Emulsion-Like Camera Rendering Transforms for Blender
https://sobotka.github.io/filmic-blender/

Filmic Blender as GLSL #38

Closed milasudril closed 6 years ago

milasudril commented 6 years ago

I want to implement something like "Filmic Blender" in GLSL, so I have some questions about the details of how the transformation works. I have found that there are two sets of LUTs: an RGB-to-RGB mapping, and a 1D mapping for controlling contrast.

  • How are input values (those that come from the lighting algorithm) mapped to the range [0, 64]?
  • When is the contrast control applied?
  • How does the exposure control work?

Note that reusing the same LUTs may not be necessary, so some references with formulas would also be interesting.

Blendify commented 6 years ago

I am guessing this is for software that is not Blender?

SolarLiner commented 6 years ago

I'm guessing so.

The problem with screen-space shaders is that their quality depends on the raw buffer they're transforming. For example, 3D rendered in immediate mode almost always assumes a gamma of 2.2, whereas with deferred shading the gamma transform is applied as the last step.

I assume Filmic Blender uses the very basic "1 photon + 1 photon = 2 photons" linear model, whereas most realtime 3D already has the logarithmic assumption of "100 photons + 1 photon is only a 1% increase". MinutePhysics explains it best: https://www.youtube.com/watch?v=LKnqECcg6Gw (replace Photoshop with realtime 3D and blur with post-processing in general).
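The linear-vs-encoded point can be shown with a toy example (plain Python, illustrative numbers only): averaging a black and a white pixel gives different display results depending on whether the average is taken in linear light or on the gamma-encoded values.

```python
# Toy demonstration of the linear-light point above (illustrative only).
black, white = 0.0, 1.0   # linear-light intensities
gamma = 2.2

# Physically sensible: average linear intensities, then encode for display.
linear_avg = (black + white) / 2             # 0.5 in linear light
display_correct = linear_avg ** (1 / gamma)  # noticeably brighter than 0.5

# Common realtime shortcut: average the already-encoded values directly.
encoded_black = black ** (1 / gamma)
encoded_white = white ** (1 / gamma)
display_wrong = (encoded_black + encoded_white) / 2

print(display_correct, display_wrong)
```

The "correct" result is visibly brighter, which is exactly the mismatch the video demonstrates with blurring.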

echuber2 commented 6 years ago

You should be able to find lots of external resources about HDR and tonemapping in GLSL. For example, https://learnopengl.com/?_escaped_fragment_=Advanced-Lighting/HDR#!Advanced-Lighting/HDR ~~Anything relating to this project specifically you could glean from the source code, no?~~ (My mistake, this project doesn't provide any source code.)

milasudril commented 6 years ago

@echuber2 I have already read that, but their exposure tone mapping is 1D per channel, which still fails. I tested it, and yes, you get the ugly hue shift when increasing the light source intensity.

milasudril commented 6 years ago

The shader is for a game engine I am working on in my spare time. If you want to know, it will be released under GPL3. The raw output from the shader is linear. This means that any sRGB textures are converted to linear half format.
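The sRGB-to-linear conversion mentioned here is the standard piecewise decode; a minimal per-channel sketch (the function name is mine, not from any engine):

```python
def srgb_to_linear(c):
    """Standard piecewise sRGB decode for a channel value in [0, 1]."""
    if c <= 0.04045:
        return c / 12.92
    return ((c + 0.055) / 1.055) ** 2.4

# A mid-grey sRGB texel (0.5) decodes to roughly 0.214 in linear light.
print(srgb_to_linear(0.5))
```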

sobotka commented 6 years ago

You need 1D LUTs to shape the intensity channels properly. Either that or write the curve in code and have your game engine do it on the fly. LUTs will be faster.

The general idea is pretty simple:

  1. Make a Log2 transform of the data range in question.
  2. Apply a desaturation transform to handle the graceful desaturation. This is a 3D LUT or some other approach.
  3. Make an aesthetic curve that adjusts the contrast so that it is close to what you want.

If you dump the 1D LUTs into a spreadsheet, you can look at the curves. The Log2 ranges are defined by the AllocationTransform.

Also bear in mind that you would need to account for the primaries of the reference space and output device. If one assumes REC.709 primaries for the reference and output device, this is a no-op. If any combination is different however, you'd need to properly transform for primaries. This includes the more complex case of HDR displays for example.
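As a sketch only, the three steps might look like the following in Python/NumPy. The stop range (−10 to +6.5 stops around middle grey 0.18), the Rec.709 luminance weights, and the smoothstep stand-in for the contrast LUT are all illustrative assumptions; the actual shapes live in the config's LUT files.

```python
import numpy as np

MIDDLE_GREY = 0.18
LOW_STOPS, HIGH_STOPS = -10.0, 6.5   # assumed log2 allocation range

def log2_shaper(rgb):
    """Step 1: map scene-linear values onto [0, 1] with a log2 transform."""
    floor = MIDDLE_GREY * 2.0 ** LOW_STOPS
    stops = np.log2(np.maximum(rgb, floor) / MIDDLE_GREY)
    return (stops - LOW_STOPS) / (HIGH_STOPS - LOW_STOPS)

def desaturate(rgb, amount=0.0):
    """Step 2: pull colours toward the luminance axis
    (a stand-in for the config's 3D desaturation LUT)."""
    lum = float(rgb @ np.array([0.2126, 0.7152, 0.0722]))  # Rec.709 weights
    return lum + (rgb - lum) * (1.0 - amount)

def contrast_curve(x, lut):
    """Step 3: per-channel 1D LUT lookup with linear interpolation."""
    positions = np.linspace(0.0, 1.0, len(lut))
    return np.interp(x, positions, lut)

# Illustrative smoothstep-shaped contrast LUT instead of the real one.
samples = np.linspace(0.0, 1.0, 4096)
toy_lut = samples * samples * (3.0 - 2.0 * samples)

pixel = np.array([0.18, 0.18, 0.18])   # middle grey, scene-linear
encoded = contrast_curve(desaturate(log2_shaper(pixel)), toy_lut)
```

Dumping the real 1D LUTs into a spreadsheet, as suggested above, is the way to replace the toy curve with the actual one.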

milasudril commented 6 years ago

So, the order of operation is

frag_color = to_srgb(contrast_curve(desaturate(log2(hdr_input))));

sobotka commented 6 years ago

There is no sRGB anywhere.

The contrast curve is designed for a power 2.2 device.
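For context on that distinction (my comparison, not from the config): the piecewise sRGB encoding and a pure 2.2 power function are close but not identical, which is why "sRGB" and "power 2.2 device" are not interchangeable here, and why the contrast curve's output needs no further encode at all.

```python
def srgb_encode(x):
    """Piecewise sRGB encoding (IEC 61966-2-1)."""
    if x <= 0.0031308:
        return 12.92 * x
    return 1.055 * x ** (1 / 2.4) - 0.055

def power_22_encode(x):
    """Pure 2.2 power function, the encoding the contrast curve targets."""
    return x ** (1 / 2.2)

# The two curves diverge most visibly in the shadows.
for v in (0.002, 0.18, 0.5):
    print(v, srgb_encode(v), power_22_encode(v))
```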

milasudril commented 6 years ago

But otherwise my understanding is correct?

sobotka commented 6 years ago

This particular pipe implements a log, then desaturation, then contrast. Applied and mastered for a 2.2 power function display when an sRGB display device is selected.

milasudril commented 6 years ago

> This particular pipe implements a log, then desaturation, then contrast. Applied and mastered for a 2.2 power function display when an sRGB display device is selected.

Good explanation. This is exactly what I needed.