o3de / o3de.org

The O3DE website

[DOCS] Explain the ACEScg color space and its effect on shading #1618

Open bitinn opened 2 years ago

bitinn commented 2 years ago

Describe the issue briefly

O3DE is perhaps the first major engine to assume shading happens in the ACEScg color space, but very little is written to explain this: things like the potential performance cost, and what might break if you write a shader by hand (PBR and NPR).

One thing I note is that an operation as simple as an Overlay blend requires conversion from ACEScg back to linear sRGB to perform. (See BlendUtility.)

Which page(s) / section(s) are affected?

https://www.o3de.org/docs/atom-guide/look-dev/color-management/ https://www.o3de.org/docs/atom-guide/features/ https://www.o3de.org/docs/atom-guide/look-dev/materials/pbr/

Does this work have an engineering dependency? What is it?

No

sptramer commented 2 years ago

@o3de/sig-graphics-audio - Your SIG will need to be a stakeholder, since this is likely related to implementation details and some research that the team has on hand.

HogJonny-AMZN commented 2 years ago

Hi @bitinn

ACES is an industry-standard color workflow for a world of media on SDR, Cinema, and HDR10+ display technology, and it is now being adopted by games, where it provides similar benefits for overlapping color-management and display problems. It is important to remember that ACES is a color workflow framework. It offers reference transforms that cover SDR, 1000-nit, 2000-nit, and 4000-nit display output levels. However, it is expected that additional ODTs will be created for additional devices (this is where it attempts to solve extensible future-proofing).

Understanding ACES (Link)

Below is the general approach we have taken (it's not the only possible approach, but it's a good compromise between quality and game-dev needs). It is paraphrased from Nvidia information on the topic (and it's a gist, so I may not have every technical detail correct).

General Recommendation for render pipeline: HDR Display Pipeline, Critical path to UHD for 9th Gen Game Production:

It is ideal to render in a linear scene-referred color space such as ACEScg; this is the color space where lighting and color grading occur (but the question is where and how do we transform all color into this space in an efficient and performant manner?)

Note that the true value proposition here lies in things like the following:

Note (this is important) from Nvidia: "To ease the application of the ACES-inspired system to games, we have created an ACES pipeline that is parameterized to be able to implement a wide range of the reference ACES ODTs as well as additional tweaks to handle cases beyond the reference ODTs." and, referring to ODTs, "In that spirit (ACES as reference), we have created a parameterized version of the reference ODTs. The parameterized version implements the stages of the standard ODTs, but allows them to be modified to handle a wider range of devices. It also helps simplify the shader development and maintenance."

The O3DE Display Mapper and ACES implementation is based on Nvidia reference code.

This file is a good example of the use of color space transforms for HDR Color Grading: "o3de\Gems\Atom\Feature\Common\Assets\ShaderLib\Atom\Features\PostProcessing\HDRColorGradingCommon.azsl"

There are other color spaces, and sometimes operations are best performed in another space, but the results end up in ACEScg before the frame moves on to the final RRT/ODT for the display device. Note: color science and color management is a deep topic; there is a lot of information and documentation on the internet that covers it, and you can join the ASWF Slack and community for additional help and resources.
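The transform into ACEScg described here boils down to a single 3x3 matrix multiply. A minimal Python sketch (hedged: these are the commonly published linear sRGB/Rec.709 (D65) to AP1 (D60, Bradford-adapted) coefficients, not code from O3DE's TransformColor):

```python
# Sketch of the linear sRGB -> ACEScg conversion. The matrix values are the
# commonly published Rec.709 -> AP1 coefficients; illustrative, not O3DE source.

SRGB_TO_ACESCG = [
    [0.6130973, 0.3395229, 0.0473793],
    [0.0701942, 0.9163556, 0.0134502],
    [0.0206156, 0.1095698, 0.8698151],
]

def srgb_to_acescg(rgb):
    """Matrix-multiply a linear sRGB triple into ACEScg (AP1) primaries."""
    return tuple(sum(m * c for m, c in zip(row, rgb)) for row in SRGB_TO_ACESCG)

# White stays white (each row sums to ~1); a saturated sRGB red lands well
# inside the wider AP1 gamut.
print(srgb_to_acescg((1.0, 1.0, 1.0)))
print(srgb_to_acescg((1.0, 0.0, 0.0)))
```

Because the matrix rows each sum to approximately 1, neutral (gray) values pass through unchanged, which is why the transform is safe to apply per-pixel in shading code.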

Consider these docs as required reading (this is what I used to gain an understanding and define the requirements for the product):

  1. https://on-demand.gputechconf.com/siggraph/2016/presentation/sig1611-thomas-true-high-dynamic-range-rendering-displays.pdf
  2. https://developer.nvidia.com/sites/default/files/akamai/gameworks/hdr/UHDColorForGames.pdf
  3. https://www.toadstorm.com/blog/?p=694
  4. https://chrisbrejon.com/cg-cinematography/chapter-1-5-academy-color-encoding-system-aces/
bitinn commented 2 years ago

@HogJonny-AMZN

Thx, I had read link 2, 3 & 4 before.

  • Create most content with sRGB primaries as done today for LDR
  • Render high-quality HDR using physically-based shading
  • Render linear scene RGBA16f
  • Transform linear scene into ACEScg scene referred color space (IDT)
  • Apply color grading to the rendered scene referred image (look modification)
  • Tone map with a filmic ACES-derived tonemapper (RRT)
  • Composite 8-bit sRGB referenced UI as normal, and encode EOTF|PQ etc. (ODT)
  • Keep backbuffer in FP16 scRGB
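The encode step at the end of this list can be sketched with the two standard transfer functions involved: the sRGB OETF for an SDR swapchain and PQ for an HDR10 one. A minimal Python sketch (the constants are the published ones from IEC 61966-2-1 and SMPTE ST 2084; this is illustrative, not engine code):

```python
# Sketch of the final encode (ODT) step: sRGB OETF vs. PQ (SMPTE ST 2084).
# Constants are the standard published ones; illustrative, not O3DE's ODT code.

def srgb_oetf(c):
    """Linear [0,1] -> sRGB-encoded [0,1] (IEC 61966-2-1)."""
    if c <= 0.0031308:
        return 12.92 * c
    return 1.055 * c ** (1.0 / 2.4) - 0.055

def pq_oetf(y):
    """Normalized luminance (1.0 = 10,000 nits) -> PQ-encoded [0,1]."""
    m1, m2 = 2610.0 / 16384.0, 2523.0 / 4096.0 * 128.0
    c1, c2, c3 = 3424.0 / 4096.0, 2413.0 / 4096.0 * 32.0, 2392.0 / 4096.0 * 32.0
    yp = y ** m1
    return ((c1 + c2 * yp) / (1.0 + c3 * yp)) ** m2

print(srgb_oetf(0.5))  # perceptual encode for an SDR backbuffer
print(pq_oetf(0.01))   # 100 nits lands near the middle of the PQ range
```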

However, I believe your answer contradicts this sentence from the O3DE docs:

Colors are stored as linear sRGB on disk and later converted to ACEScg before passing to the shaders and the GPU.

https://www.o3de.org/docs/atom-guide/look-dev/materials/pbr/#base-color

Your answer suggests O3DE still does most of its rendering in linear sRGB. The IDT in your workflow is applied right before color grading and tone mapping. This is different from the so-called full ACES workflow in many DCCs, where the working color space can be set to ACEScg and the IDT is applied on asset import or texture sampling.

What you describe is quite common among modern engines, and we don't even need RGBA16f during shading; if we just want to tonemap HDR luminance at the end, a smaller format such as R11G11B10 is perfectly fine.

What requires RGBA16f (besides alpha composition) is wider-gamut rendering, e.g. using something like ACEScg as the working color space. Both the O3DE docs and the BlendUtility code suggest this is a possibility. Hence this marketing line about O3DE:

Another significant change is the Atom Renderer, which, as expected, is delivered as a Gem. This renderer supports multiple platforms by providing a modern physically based renderer (PBR) that is ACES colorspace-compliant

https://aws.amazon.com/blogs/gametech/open-3d-engine/

But if the IDT is only done before color grading (and I don't think this step should be called an IDT; per definition, an IDT is only for input devices), then I think O3DE is no different from other engines, which do this before grading / tonemapping as well.

HogJonny-AMZN commented 2 years ago

@bitinn O3DE does not yet have fully implemented color management, although we have some ideas of where to head (OpenColorIO) and an internal planning doc that should surface as an RFC. You're correct that we don't use OCIO directly for color management yet; we are not defining the working color space in a data-driven manner. We do transform the linear sRGB values to ACEScg in the shading code.

"Your answer suggests O3DE still does most of its rendering in linear sRGB." Most of the rendering is within the ACEScg space, including color grading. We may have originally rendered to linear sRGB and transformed the frame into ACEScg; however, incremental improvements have been moving us closer to proper color management. Currently, colors and textures are generally stored as linear sRGB, and the color space transform happens earlier (I guess you could call this the IDT, in a sense).

If you look at StandardPBR for example, you'll see where the base color is transformed: "o3de\Gems\Atom\Feature\Common\Assets\Materials\Types\MaterialInputs\BaseColorInput.azsli"

float3 GetBaseColorInput(Texture2D map, sampler mapSampler, float2 uv, float3 baseColor, bool useTexture)
{
    if(OverrideBaseColorEnabled())
    {
        return GetBaseColorOverride();
    }

    if(useTexture)
    {
        float3 sampledAbledo = map.Sample(mapSampler, uv).rgb;
        return TransformColor(sampledAbledo, ColorSpaceId::LinearSRGB, ColorSpaceId::ACEScg);
    }
    return baseColor;
}
HogJonny-AMZN commented 2 years ago

@santorac and @invertednormal can chime in here with more technical specificity

santorac commented 2 years ago

Yes, as you pointed out, for textures it samples assuming the sample will be linear sRGB (the image is stored as linear sRGB, or stored as sRGB and converted to linear by the sampler), then converts to ACEScg.

For color values, like those for materials and lights, they are stored on disk as linear sRGB and then converted to ACEScg right before being sent to the GPU. See Material::SetShaderConstant and ***LightFeatureProcessor::SetRgbIntensity. (This is what the docs are referring to where they say "Colors are stored as linear sRGB on disk and later converted to ACEScg before passing to the shaders and the GPU".)
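The sampler-side sRGB-to-linear decode mentioned above can be sketched as follows (this is the standard IEC 61966-2-1 piecewise curve that sRGB-format texture hardware applies; illustrative only):

```python
# Sketch of the sRGB -> linear decode a GPU sampler performs for sRGB-format
# textures (IEC 61966-2-1 piecewise EOTF). Illustrative only.

def srgb_to_linear(c):
    """sRGB-encoded [0,1] -> linear [0,1]."""
    if c <= 0.04045:
        return c / 12.92
    return ((c + 0.055) / 1.055) ** 2.4

# Mid-gray in the encoded image is roughly 21% linear reflectance.
print(srgb_to_linear(0.5))
```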

Regarding the Overlay and LinearLight blend modes, maybe someone could find a way to do those blends in ACEScg and get similar results to sRGB. Or at least someone could add a shader option so users can choose between performance vs doing it in sRGB.

bitinn commented 2 years ago

@HogJonny-AMZN @santorac

Thx for your responses, so it appears the doc is correct, and shading is done in ACEScg color space.

I would like to know:

It would be lovely if these questions are addressed in the doc.

Additional reading:

HogJonny-AMZN commented 2 years ago

"Whether this means a RGBA16f color buffer is required for deferred rendering, because lower precision wouldn't be able to fully support wider gamut."

We need a 16-bit buffer because of linear light values, which are super common for any modern renderer. Once you have linear light values, really all ACEScg does is give you a wider color gamut to work with: the ACEScg primaries cover a much wider gamut than the sRGB primaries, and this results in better color precision and reproduction.
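A quick way to see why a 16-bit float buffer suits linear HDR light values: IEEE 754 half precision keeps roughly three significant digits at any magnitude up to 65504, which fits scene-referred radiance well. A minimal sketch using Python's half-float packing (illustrative; nothing here is O3DE code):

```python
import struct

# Round-trip floats through IEEE 754 half precision to see what survives.
def to_half(x):
    return struct.unpack('<e', struct.pack('<e', x))[0]

print(to_half(0.5))        # exact: a power of two
print(to_half(65504.0))    # the largest finite half-float
print(to_half(1.001))      # quantized: spacing near 1.0 is 2**-10
```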

The current rendering pipeline is forward+, not deferred. Someone could write a deferred rendering pipeline though.

"What was the initial reasoning behind using ACEScg as the default rendering color space?"

The reasoning is industry standards for VFX, games, and other entertainment use cases, and future-proofing the work on a new core renderer.

ACES is important because color reproduction on monitors has gotten much better over the last few years, and most modern TVs / monitors support a much wider color gamut than sRGB. ACEScg is meant to be a standard wide color gamut that should be pretty future-proof because it contains this wider range. The ACEScg primaries are a beneficial compromise: wide gamut (covering much more of the visible spectrum) but closer to the color output capabilities of modern viewing devices. The idea is that you get the best fit for the display device's capabilities, with the highest-quality color reproduction across varied display types.
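The "wider gamut" point can be made concrete: a fully saturated ACEScg (AP1) red has no sRGB representation, so mapping it back into sRGB primaries produces out-of-range components. A sketch using the commonly published AP1-to-Rec.709 matrix (illustrative, not O3DE source):

```python
# A saturated AP1 color falls outside the sRGB gamut: converting it back to
# linear sRGB yields components above 1 and below 0.

ACESCG_TO_SRGB = [
    [ 1.7050510, -0.6217921, -0.0832589],
    [-0.1302564,  1.1408048, -0.0105483],
    [-0.0240033, -0.1289690,  1.1529724],
]

def acescg_to_srgb(rgb):
    """Matrix-multiply an ACEScg triple back into linear sRGB primaries."""
    return tuple(sum(m * c for m, c in zip(row, rgb)) for row in ACESCG_TO_SRGB)

ap1_red = acescg_to_srgb((1.0, 0.0, 0.0))
print(ap1_red)  # red channel > 1, green and blue negative: outside sRGB
```

This is exactly why the final ODT has to remap (rather than simply clamp) colors for the target display.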

As stated, ACES is the industry standard for HDR color workflows. It's even what Nvidia recommends for this generation of development. It's been used with some games already, and even Unity and Unreal are moving in this direction. Yes, that gives you robust WYSIWYG color across the DCC toolchain, in post, etc. The concept of configuring your renderer and all of your tools to conform to color management is also very valuable in production. We can't configure the renderer yet, but you can at least use OCIO to configure all of your compatible tools to be close to the renderer.

"to reduce precision lost during shading?"

Yes, you get the best color reproduction whether rendering to an LDR or an HDR display; ACEScg is a wider-gamut color space than sRGB. The reproduction of color will be more accurate and precise, and this should be true on any display. The ACES documentation has examples of how and when this is a clear improvement. (And go watch this.)

"Are there gotchas we should be aware of when shading in ACEScg as opposed to linear sRGB?"

Not sure what you would mean by a gotcha, other than that you should transform into the ACEScg color space and gamut to perform most color-based operations. This isn't always the case, though. The Overlay blend, for example, is currently converted to regular sRGB because overlay blending expects values in the 0-1 range (and it's regular sRGB, as opposed to linear sRGB, because that's the space these blend operations normally occur in for image-editing tools like Photoshop, and we're trying to be consistent with that experience because it's intuitive and familiar for this type of color operation). We could do overlay blending with ACEScg primaries, but you'd still need to make sure you were in the 0-1 range with a reasonable gamma for it to work and remain intuitive-feeling.
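The 0-1 requirement comes from the blend formula itself. A minimal sketch of the classic per-channel Overlay blend as image editors define it (illustrative; this is not O3DE's BlendUtility):

```python
# Photoshop-style overlay: multiply in shadows, screen in highlights.
# Only meaningful for encoded values in [0, 1], which is why the engine
# converts to regular sRGB before blending.

def overlay(base, blend):
    if base < 0.5:
        return 2.0 * base * blend
    return 1.0 - 2.0 * (1.0 - base) * (1.0 - blend)

print(overlay(0.25, 0.5))  # dark base: behaves like multiply
print(overlay(0.75, 0.5))  # bright base: behaves like screen
```

Feeding it an unbounded HDR value would push the `(1 - base)` term negative, which is why the formula breaks down outside 0-1.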

It may be worth looking into where we're using blend modes in our shaders and keeping baseColor in sRGB space until after all blending is done, then convert it to ACEScg. We are still early, so this may be an area to consider improvements.

"Are there any additional costs incurred when outputting the ACEScg result to screen?"

Our implementation is a version of ACES based on Nvidia's parameterization and reference code; it's the most accurate, and this has a performance cost. There is also an ACES LUT-based approximation, which can be enabled in the Display Mapper; it is less accurate but will have better performance. Note, this all occurs in the Display Mapper, which is extensible; for instance, someone could theoretically implement a less accurate best fit (something like https://knarkowicz.wordpress.com/2016/01/06/aces-filmic-tone-mapping-curve/).
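That linked "best fit" is a tiny rational curve. A sketch of Narkowicz's approximation of the combined ACES RRT+ODT for SDR output (the constants are from that blog post; this is not the Display Mapper's reference path):

```python
# Narkowicz 2015 "ACES Filmic Tone Mapping Curve": a cheap rational fit of
# the ACES RRT+ODT for SDR. Illustrative; not O3DE's reference ACES path.

def aces_film(x):
    """Tone map a linear luminance value into [0, 1]."""
    a, b, c, d, e = 2.51, 0.03, 2.43, 0.59, 0.14
    y = (x * (a * x + b)) / (x * (c * x + d) + e)
    return min(max(y, 0.0), 1.0)

print(aces_film(0.0))   # black stays black
print(aces_film(1.0))   # reference white compresses below 1
print(aces_film(16.0))  # bright HDR values saturate toward 1
```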

"for example, does o3de simply clip the extra gamut?"

The ACES tonemapping and RRT/ODT deal with HDR brightness and gamut; when output to a display, the image is transformed into the best fit for the target display. It's not clipped; it's remapped. Each ODT provides the best possible reproduction of color for the display's target luminance and gamut. (To my knowledge, this is a mix of color science and expert human eyeballs determining what "best" means.)

"does it do additional gamut mapping beyond standard color grading / tone mapping?"

Yes, ACES has a standardized approach to gamut mapping and tone mapping, but in ACES, color grading and tone mapping are probably very different from what you may be used to. Look Modification (LMT) in ACES covers color grading and other look-based color modifications; however, they are done in the linear scene-referred color space (there are variations, but Atom uses ACEScg as the linear wide-gamut floating-point HDR color space), and this occurs beneath the RRT/ODT and the tone mapping into the display-referred space (the monitor's gamut and color space). This is different from the non-ACES workflow in previous-generation renderers, where you would expose, color grade, and tone map the final image in the sRGB output display space itself, commonly utilizing a baked LUT. (However, there is a mode in the Display Mapper that would still let you do that, but it has limitations; see comments below.)

The “HDR Color Grading Component” facilitates doing correct look based color grading within the ACES workflow, and allows you to also create LMT LUTs compatible with the ACES workflow and can be loaded into a “Look Modification Component”, this grading occurs before the Display Mapper which applies the RRT/ODT and ACES tone mapping for output on the viewing device.
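The ordering matters: a grade applied in the scene-referred space (before tone mapping) gives a different, better-behaved result than the same grade applied after tone mapping in display space. A sketch, using the Narkowicz ACES approximation as a stand-in for the RRT/ODT (assumptions: `aces_film` and `push_one_stop` are illustrative helpers, not engine code):

```python
# Grade-then-tonemap (ACES LMT order) vs. tonemap-then-grade (last-gen order).
# aces_film is the Narkowicz rational fit, standing in for the RRT/ODT.

def aces_film(x):
    a, b, c, d, e = 2.51, 0.03, 2.43, 0.59, 0.14
    return min(max((x * (a * x + b)) / (x * (c * x + d) + e), 0.0), 1.0)

def push_one_stop(x):
    # A one-stop exposure grade (hypothetical LMT).
    return x * 2.0

scene_referred = aces_film(push_one_stop(2.0))   # grade, then tone map
display_referred = push_one_stop(aces_film(2.0)) # tone map, then grade

print(scene_referred)    # stays within [0, 1]
print(display_referred)  # blows out past 1.0
```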

Without the HDR Color Grading Component and rendering features, doing this is not only complex but also tedious.

"I have been told by people familiar with lumberyard that it can switch back to linear sRGB shading when required, is that an option in o3de or does it require modifying sources?"

This isn't Lumberyard; Atom is an entirely new renderer, and the default rendering pipeline is configured around the ACES color-managed workflow and config 1.0.3, as this provides the highest degree of accuracy and compatibility across a much bigger range of display technologies. We have many use cases, including not-games, and we wanted to make sure the initial core offering is the best possible baseline (and future-proof).

Ideally you would prefer ACES even on older sRGB/Rec.709 displays, as it has far superior color reproduction and improved tone mapping (although of course that may be debatable and subjective, but we agree with the experts in this field).

Atom can display on an sRGB LDR/SDR display, or on a range of HDR displays at various nit levels of brightness. You should theoretically be able to author everything on standard sRGB monitors, but get all of the benefits of, and look better on, any HDR display your project is displayed on. Is there some reason that you want to be constrained by a last-gen linear sRGB render pipeline, given that you wouldn't be fully utilizing the capabilities of any HDR display in that situation?

If you add the Display Mapper level component, you can see the available settings, including "passthrough" (no ODT or ACES tonemapping), a simple GammaSRGB output, similarly a last-gen Reinhard tonemapper, and also the ability to enable old-school LDR tonemapping and previous-gen color LUTs (the closest to what I think you are expecting).
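For reference, the "last gen Reinhard tonemapper" option in its simplest textbook form looks like this (illustrative only; the Display Mapper's version may differ):

```python
# Basic Reinhard operator: compresses linear luminance [0, inf) into [0, 1).

def reinhard(x):
    return x / (1.0 + x)

print(reinhard(1.0))    # mid exposure maps to 0.5
print(reinhard(100.0))  # bright values compress toward (but never reach) 1
```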

(Screenshot: Display Mapper component settings.)

HogJonny-AMZN commented 2 years ago

No additional response or request for additional info, so closing this.

sptramer commented 2 years ago

Going to reopen for some additional evaluation by sig-docs-community now that the thread is completed - this is a lot of information on the colorspace and the decision behind using it that might be valuable in official docs.

bitinn commented 2 years ago

Sorry for the late reply; the answer was long and mixed with buzzwords, so I didn't write down my thoughts upon my initial read:

Based on what I read:

Thx for all involved.

sptramer commented 2 years ago

@aFinchy - Candidate for blog post about the decision to use ACEScg, documentation team to select and advise in official docs on best practices for the colorspace + color grading needs.

HogJonny-AMZN commented 2 years ago

Your opinions are also valid; take them to the Sig-Graphics-Audio meetings and have this conversation with the larger group.

Ideally, if we carried on with our plans for full color management and also made the renderer configurable (OpenColorIO), then the choice of working color space would be configurable, color space transforms would be more flexible, and the renderer would thus be less opinionated.