StarLederer opened 1 year ago
This is a solid writeup. I like that there's a clear spec to follow and link to. Your proposals around configurability are good too.
@StarLederer For reference, this is what Baking Lab's implementation looks like with fully saturated colors:
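For anyone who hasn't seen it: the Baking Lab tonemapper is Stephen Hill's rational-polynomial fit of the RRT combined with the sRGB ODT. Below is a minimal Rust sketch of that fit, with constants transcribed from the Baking Lab source; it is an illustration for this discussion, not Bevy's actual code.

```rust
// Stephen Hill's fitted approximation of the ACES RRT + sRGB ODT
// (from Baking Lab). Input: linear Rec.709/sRGB. Output: linear
// values in [0, 1] that still need sRGB gamma encoding for display.

type Vec3 = [f32; 3];
type Mat3 = [[f32; 3]; 3]; // row-major

// sRGB => XYZ => D65_2_D60 => AP1 => RRT_SAT
const ACES_INPUT_MAT: Mat3 = [
    [0.59719, 0.35458, 0.04823],
    [0.07600, 0.90834, 0.01566],
    [0.02840, 0.13383, 0.83777],
];

// ODT_SAT => XYZ => D60_2_D65 => sRGB
const ACES_OUTPUT_MAT: Mat3 = [
    [1.60475, -0.53108, -0.07367],
    [-0.10208, 1.10813, -0.00605],
    [-0.00327, -0.07276, 1.07602],
];

fn mul(m: &Mat3, v: Vec3) -> Vec3 {
    [0usize, 1, 2].map(|i| m[i][0] * v[0] + m[i][1] * v[1] + m[i][2] * v[2])
}

// Rational-polynomial fit of the combined RRT + ODT tone curve.
fn rrt_and_odt_fit(v: Vec3) -> Vec3 {
    v.map(|x| {
        let a = x * (x + 0.0245786) - 0.000090537;
        let b = x * (0.983729 * x + 0.4329510) + 0.238081;
        a / b
    })
}

fn aces_fitted(color: Vec3) -> Vec3 {
    let c = rrt_and_odt_fit(mul(&ACES_INPUT_MAT, color));
    mul(&ACES_OUTPUT_MAT, c).map(|x| x.clamp(0.0, 1.0))
}
```

Note how the final clamp and output matrix bake in the assumption of an SDR sRGB target, which is relevant to the HDR discussion below.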
I'm here to mention that it should already be implemented at render time so that the calculations can happen in ACEScg for more accurate results.
I suspect it might be a myth that simply working in ACEScg gives more accurate results, though I remember a YouTube video claiming it did in some rendering software. We should be limited only by the accuracy of floating-point numbers; it shouldn't matter which wavelengths you consider your three primaries internally, especially when you don't know how well they match the output device.
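To make that concrete, here is a small round-trip check (a sketch; the matrix values are the commonly published Rec.709 <-> ACEScg pair, rounded, so treat them as approximate). Converting into ACEScg and back loses only on the order of floating-point precision, regardless of which primaries the working space uses:

```rust
// Changing working primaries is just a 3x3 change of basis, so a
// round trip through ACEScg and back costs on the order of float
// precision (here also limited by the rounded constants), no more.

type Vec3 = [f32; 3];
type Mat3 = [[f32; 3]; 3];

const SRGB_TO_ACESCG: Mat3 = [
    [0.613097, 0.339523, 0.047379],
    [0.070194, 0.916354, 0.013452],
    [0.020616, 0.109570, 0.869815],
];
const ACESCG_TO_SRGB: Mat3 = [
    [1.705051, -0.621792, -0.083259],
    [-0.130256, 1.140805, -0.010548],
    [-0.024003, -0.128969, 1.152972],
];

fn mul(m: &Mat3, v: Vec3) -> Vec3 {
    [0usize, 1, 2].map(|i| m[i][0] * v[0] + m[i][1] * v[1] + m[i][2] * v[2])
}

fn main() {
    let c: Vec3 = [0.25, 0.5, 0.75];
    let back = mul(&ACESCG_TO_SRGB, mul(&SRGB_TO_ACESCG, c));
    for (a, b) in c.iter().zip(back.iter()) {
        // Prints errors around 1e-6: float precision, not the choice
        // of primaries, is the limiting factor.
        println!("{a} -> {b} (abs err {:e})", (a - b).abs());
    }
}
```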
The main benefit of my proposal, in my opinion, is effortless HDR support (developers should be able to build HDR-capable graphics without even realizing it), as well as support for future advances in display technology.
I must apologize to @CptPotato because I missed that reply from 2023. That implementation looks pretty good, and I never said otherwise, but I believe the actual transform developed by AMPAS can be even better, because it was developed by collaborating industry experts. I agree this is probably not critical, though. What I do think is critical is that the Baking Lab transform assumes an sRGB output device and will look really bad if simply stretched onto an HDR display. I tried finding the source but couldn't: there was a 3kliksphillip video about CS2, around its release, mentioning that the game supported HDR but looked so bad with it enabled that he couldn't imagine anyone playing that way. I cannot claim this was caused by any specific mistake in the internal logic, since we don't have access to the source (haha, pun), but it is definitely an example of how the user feels when HDR is not done properly.
What problem does this solve or what need does it fill?
Currently, Bevy has an sRGB-dependent, visually sub-optimal look. This is a barrier to implementing HDR display support and makes images rendered in Bevy look like they were taken with an old, low-quality digital camera. There is an ongoing effort to address this in #6677; however, it proposes a very limited approximation of a well-known color grading workflow that has visual artifacts and does not fully succeed at creating the desired filmic look.
What solution would you like?
I would like to see an implementation of the ACES workflow in Bevy's tonemapping / post-processing / rendering system, because it solves both problems outlined in this issue using a combination of an intermediary color space, display-specific transforms, and a look modifier that creates the desired filmic effect.
"ACES is a free, open, device-independent color management and image interchange system that can be applied to almost any current or future workflow. It was developed by hundreds of the industry’s top scientists, engineers and end users, working together under the auspices of the Academy of Motion Picture Arts and Sciences" (AMPAS, n.d).
ACES solves Bevy's sRGB dependence with its intermediary ACES 2065-1 color space (also known as ACES AP0). All inputs (in our case, just the camera render target) are converted to ACES 2065-1, processed to suit artistic needs, and displayed to the user through an Output Device Transform (ODT) that corresponds to their display type. This lets any developer or artist work on any kind of display and be confident that the end result will be perceptually similar to their artistic intention, whether the user views it on a CRT display or an HDR-enabled TV (as long as the correct ODT is selected).
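For reference, the input conversion is a single 3x3 matrix per input color space. For a linear Rec.709/sRGB render target, the Bradford-adapted matrix to ACES 2065-1 is commonly published as follows (values approximate; a sketch for illustration, not a proposed API):

```rust
// Bradford-adapted linear Rec.709/sRGB -> ACES 2065-1 (AP0) matrix,
// as commonly published in ACES reference material (approximate).
const SRGB_TO_ACES2065_1: [[f32; 3]; 3] = [
    [0.4397010, 0.3829780, 0.1773350],
    [0.0897923, 0.8134230, 0.0967616],
    [0.0175440, 0.1115440, 0.8707040],
];
```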
The filmic look is created by another component of ACES, the Reference Rendering Transform (RRT). The main function of the RRT is to adapt the ACES 2065-1 color space to human eyes for critical color evaluation (AMPAS, 2013, p. 8). Coincidentally, this is achieved by mapping colors in a way that emulates how the image would have looked had it been captured on film, which creates a really pleasant look. The RRT is highly regarded in the games industry and is applied by default in Unreal Engine (Brackeys, 2017). Admittedly, the RRT is not without criticism (e.g. Stout, 2022; Meeting Summaries, 2023), but it remains the standard proposed by AMPAS.
#6677 already proposes an approximation of the look usually achieved by the ACES workflow; however, it does not solve the dependence on sRGB and (subjectively) fails to recreate the famous ACES look.
I would like a solution that implements a transform to the ACES 2065-1 color space, optionally (and by default) followed by the RRT, followed by a user- or developer-selected ODT. That way there are no hard-coded assumptions about the output display type, which opens up the possibility of handling HDR displays and/or unexpected use cases like VFX rendering for movies, and the desired filmic look is faithfully recreated. A rough sketch of what that could look like follows.
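To illustrate, the configuration surface could be as small as this. All names here are hypothetical and exist only to show the proposed pipeline order (scene-linear -> ACES 2065-1 -> optional RRT -> display-specific ODT); this is not an existing Bevy API:

```rust
// Hypothetical configuration sketch -- none of these names exist in
// Bevy; they only illustrate the proposed pipeline order.

/// Which Output Device Transform to apply as the final step.
enum OutputDeviceTransform {
    Srgb100Nits,       // standard-dynamic-range monitor
    Rec2020Pq1000Nits, // HDR10-style display
    Rec709Broadcast,   // broadcast TV
    // Future display types slot in here without touching scene code.
}

struct AcesTonemapping {
    /// Apply the Reference Rendering Transform (the filmic look).
    /// Proposed default: true.
    apply_rrt: bool,
    /// Selected by the developer, or detected from the user's display.
    odt: OutputDeviceTransform,
}

impl Default for AcesTonemapping {
    fn default() -> Self {
        Self {
            apply_rrt: true,
            odt: OutputDeviceTransform::Srgb100Nits,
        }
    }
}
```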
AMPAS offers a reference implementation of most steps outlined above.
What alternative(s) have you considered?
#6677 is a step in the right direction, but it implements a poor approximation of the ACES look and does not implement the ACES workflow in any capacity.