julianxhokaxhiu / FFNx

Next generation modding platform for Final Fantasy VII and Final Fantasy VIII ( with native Steam 2013 release support! )
GNU General Public License v3.0

[ Common ] Improve the FFMpeg video decoding #536

Closed ChthonVII closed 1 year ago

ChthonVII commented 1 year ago

There are several things wrong here. Since they're in close proximity, it makes more sense to tackle them at once.

  1. The colorspace is never checked. Rather the yuv data is just handed off to FFNx.frag (or FFNx.lighting.frag) and converted to rgb using bt601 matrices. This behavior is wrong in all cases other than bt601.
    • The fix is to explicitly check the video's colorspace and return an error if it's something we can't properly convert to rgb.
    • (swscale would be willing to do a yuv->yuv conversion for us, but it would be incorrect without a gamut conversion (which I don't believe swscale can do) unless the color primaries were the same (which is not true for the most common use case bt709 <--> bt601))
    • We should strongly consider adding bt709 matrices to FFNx.frag (and FFNx.lighting.frag) so we can handle that case too. Many encoding tools already consider bt601 deprecated, and it's only going to get harder for users to generate bt601 video going forward.
    • We should also add some documentation telling users that, at least right now, bt601 is the only colorspace that we even attempt to handle correctly.
  2. Color range is never checked. Instead we're checking if pixel format is yuvj420p and then also using that as a proxy for color range. While it's true that yuvj420p will always be full range, the reverse isn't true. For instance, yuv420p10le could be tv range or full range.
    • The fix is to explicitly check the color range, independently and in addition to the pixel format check, and condition the range conversion on that.
  3. The intended range conversion is never happening. The call to sws_setColorspaceDetails on line 241 always has both source and destination range set to full range (see line 240), resulting in no conversion ever happening.
    • The fix is to use the video stream's range as the source range. (A sketch of the combined colorspace/range handling follows this list.)
    • This bug is partially compensated for by the bug in #535 which erroneously shifts the y down by 12, putting the black point at 0. This probably made both bugs harder to spot, as looking for the elevated black point is the typical way of eyeballing tv-range output.
  4. Coercing the pixel format to 420 doesn't make a lot of sense to me. I can see the point of coercing it to something, so we don't have to handle multiple cases, but it should be 444 rather than 420. First, if the video stream is giving us more chroma data, we're just throwing it away with 420. Second, as something specifically designed for these sorts of conversions, swscale is probably going to do a better job of resampling than texture2D.
    • We should also have some documentation telling users (specifically SYW) that there's little point in using a 10-bit pixel format, since we're just going to unceremoniously coerce it down to 8-bit.
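
For concreteness, here's a minimal sketch (not FFNx's actual code) of what the decode-side fixes for items 1-3 could look like, assuming codec_ctx and sws_ctx are the already-opened decoder and swscale contexts:

// Sketch only: reject colorspaces we can't properly convert, check the range
// explicitly, and feed the stream's real range into sws_setColorspaceDetails
// instead of hard-coding full range on both sides.
extern "C" {
#include <libavcodec/avcodec.h>
#include <libswscale/swscale.h>
}

static bool configure_colorspace(AVCodecContext* codec_ctx, SwsContext* sws_ctx)
{
    // 1. Colorspace: only BT.601 (and unspecified, which we assume is BT.601)
    //    matches the BT.601 matrices currently baked into FFNx.frag.
    switch (codec_ctx->colorspace) {
    case AVCOL_SPC_BT470BG:
    case AVCOL_SPC_SMPTE170M:
    case AVCOL_SPC_UNSPECIFIED:
        break; // OK, treat as BT.601
    default:
        return false; // e.g. BT.709/BT.2020: report an error upstream
    }

    // 2. Color range: check it explicitly instead of inferring it from yuvj420p.
    const bool src_full_range = (codec_ctx->color_range == AVCOL_RANGE_JPEG);

    // 3. Range conversion: the stream's range becomes the source range.
    //    Destination full range here is just for illustration; the shader-side
    //    handling decides what FFNx actually needs.
    const int* coeffs = sws_getCoefficients(SWS_CS_ITU601);
    sws_setColorspaceDetails(sws_ctx,
                             coeffs, src_full_range ? 1 : 0, // source
                             coeffs, 1,                      // destination
                             0, 1 << 16, 1 << 16);           // brightness, contrast, saturation
    return true;
}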

I can take a first crack at implementing these changes. However, I'm going to need some help, since I lack a suitable Windows build environment and won't even be able to tell if what I wrote compiles.

julianxhokaxhiu commented 1 year ago

Sorry for replying late to this issue.

(1) I don't have an HDR monitor. How does HDR look for the videos with the current PR?

I can test that for you once I know your code base is final. The PR is still marked as Draft, so I assume you're still working on it. That's why I haven't taken the time to test it yet. If that's not the case, lmk.

(2) I don't have a detailed grasp of how HDR display is supposed to work

From the last time I worked on the implementation, I remember that HDR is basically built on top of Rec.2020 as the color space. How to get there, though, is the challenge. Usually the game should be based on a full 1.0 gamut ( deep darks ) with a wider white spectrum, in order to create higher contrast. In our case we don't have that luxury, so we have to convert from Rec.709 to Rec.2020.

Now, I'm not sure how all your work relates to this, but if you can help figure out whether it at least makes sense theoretically, I can then give it a go and test it.

Resources I've used in the past:

Feel free to dig for more in any case. Also be aware that the conversion code runs for EVERYTHING in the game, including movies as it's in a post shader. In case things need to change for movies only we'll have to see how to adapt it. For now it's a pure theoretical exercise.

Thank you in advance.

ChthonVII commented 1 year ago

Turns out I was able to sort out most of the "how does this work" stuff in the airport on my way out. So it's been rattling around in my head for a week. Please forgive any unclarity in this post, as I'm very jet-lagged right now. I see 5 issues that may need attention:

  1. As I said before, the green patches are likely due to a missing saturate().
  2. The brights are very washed out due to a misunderstanding about what the monitorNits parameter is supposed to mean. According to this source, it should not be the maximum nits the monitor can produce, but rather the maximum nits it should be allowed to produce for SDR content. Theoretically, the correct value should be 100. But that author found 108 was the correct value for their specific monitor through trial-and-error testing. Also, the user might want to sacrifice some accuracy in favor of more brightness.
  3. The banding...
    a. One part of the problem, the banding near black, is due to using a piecewise gamma function with a "toe slope" when linearizing. Hopefully the new gamma function fixes that for HDR just as well as for SDR. If not, we might have to use a pure 2.2 curve when HDR is on.
    b. Another major part of the problem is caused by item 2 above. We've got only 256 steps of luma data (or only 219 if the video is tv range) and the rec2084 curve is stretching that out over thousands of steps. So of course there's going to be tons of banding. Fixing item 2 should fix this.
    c. Finally, the original videos are just pretty banded to begin with, and it probably shows more on HDR. I can try dithering everything when HDR is on as a global banding mitigation.
  4. Presently, by doing a two-step gamut conversion (e.g., NTSC-J to sRGB to rec2020), we are getting zero benefit out of having a wide-gamut monitor. To take advantage of the wide gamut, we need to go back and make everything a one-step conversion from the source gamut directly to rec2020.
    a. For videos, this will require some refactoring (along with a bunch more pre-computed matrices), but it should be pretty straightforward.
    b. For everything else, it's not so easy. We're going to need some mechanism for associating gamut metadata with textures. Probably some kind of manifest file included in the iro files. The defaults ought to be NTSC-J gamut for original textures, and sRGB gamut for modded textures with no metadata, and then the mod maker could specify the gamut for each texture if they wished. That sounds like it will require changes to both FFNx and 7th Heaven, which is getting waaay out of scope for this PR. I think the best I can do here is to just set up the plumbing so that textures could use different gamuts if and when the metadata becomes available. (Alternatively, we could do the gamut conversions in advance, but then we'd need an SDR version and a HDR version for each mod.)
  5. If we can sometimes have 10-bit video input and have 10-bit output with HDR, then we really ought to have a full 10-bit pathway available for those cases. This may prove too ambitious a task for this PR, and might need to wait for later. (Also, does anyone know whether ffmpeg's output buffers for 10-bit formats are packed 10 bits or padded to 16, and the same question for bgfx's input buffers?)
julianxhokaxhiu commented 1 year ago

Wow, that's a huge essay, so let me try to go through it step by step:

  1. Is this fixable in code? Where does this saturate() end up on our side?
  2. That's basically what we are already doing in the FFNx post shader. But of course these functions only work if the input is known to be sRGB ( which I think is the case for most of the stuff we do in movies and throughout the game ).
  3. Technically I fixed the banding exactly by using that chain of calls, and I can assure you black is deep black, not gray black, for example. I tested this on my own HDR monitor and other users have too. Only a few expose the HDR issue, like #499, but I'm not sure why. Probably it's related to the PQ part (in code we do try to autodetect the given white point of the monitor, but it gives us only the first HDR monitor that is detected; if the user moves the game window to the wrong monitor it will look like that, so most probably it's not really an issue. I'll try to approach the user again ).
  4. Let's focus on the video for now, IF there is anything to do on this front. As a matter of fact I was asking you to double check it, but without proper equipment it would be difficult for you. Don't you have a recent TV at home that has HDR?
  5. I never had a 10-bit HDR movie to test this pathway, otherwise I would have worked on the code for it. Unless a modder eventually comes out with a pack, we'll have to postpone it for another time. Anyway, definitely not for this PR :)

So all in all, feel free to take your time on the PR and let us know when it's ready ( like done, really done, on your own side and no more code patches are expected, so we can start a full review round ).

Thanks again for taking the time to reply and still work on this. Appreciated!

ChthonVII commented 1 year ago
  1. Should be totally fixable. It's going to be one of the HDR-related functions, probably the gamut conversion. Just need to wrap a saturate() around the output of the function.
  2. We're using the wrong value. We're using what dxgi tells us is MaxFullFrameLuminance ("[t]he maximum luminance, in nits, that the display attached to this output is capable of rendering"). But we should be using the SDR white level ("the brightness level that SDR 'white' is rendered at within an HDR monitor"). We can probably borrow this code from Chrome for querying it. (A sketch of the query follows this list.)
    • Also worth knowing: Aside from not being the thing we need in the first place, dxgi will give out a dummy value of 799 if the monitor's luminance metadata is busted. The dxvk source notes that some LG monitors are busted like this. Maybe that user has an LG monitor? I'd imagine trying to display 799 nits on a monitor that caps out at 400 nits or whatever might cause some issues...
  3. I'll add the dithering, since it can't hurt.
  4. Nope, no HDR anything for me.
  5. All of SYW's movies are 10-bit (despite being tv-range). I can probably sort out the packing/padding questions by trial and error, since error should segfault...
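
For reference, here's a rough sketch of how the SDR white level could be queried via DisplayConfigGetDeviceInfo (which I believe is what Chrome's code queries under the hood). This is an illustration of the approach, not the exact code going into the PR, and error handling is reduced to a single fallback:

// Sketch: query the SDR white level (in nits) of the first active display.
// Requires a Windows 10 SDK that defines DISPLAYCONFIG_SDR_WHITE_LEVEL.
#include <windows.h>
#include <vector>

static float get_sdr_white_level_nits()
{
    UINT32 pathCount = 0, modeCount = 0;
    if (GetDisplayConfigBufferSizes(QDC_ONLY_ACTIVE_PATHS, &pathCount, &modeCount) != ERROR_SUCCESS)
        return 80.0f;

    std::vector<DISPLAYCONFIG_PATH_INFO> paths(pathCount);
    std::vector<DISPLAYCONFIG_MODE_INFO> modes(modeCount);
    if (QueryDisplayConfig(QDC_ONLY_ACTIVE_PATHS, &pathCount, paths.data(),
                           &modeCount, modes.data(), nullptr) != ERROR_SUCCESS)
        return 80.0f;

    for (UINT32 i = 0; i < pathCount; ++i) {
        DISPLAYCONFIG_SDR_WHITE_LEVEL whiteLevel = {};
        whiteLevel.header.type = DISPLAYCONFIG_DEVICE_INFO_GET_SDR_WHITE_LEVEL;
        whiteLevel.header.size = sizeof(whiteLevel);
        whiteLevel.header.adapterId = paths[i].targetInfo.adapterId;
        whiteLevel.header.id = paths[i].targetInfo.id;
        if (DisplayConfigGetDeviceInfo(&whiteLevel.header) == ERROR_SUCCESS) {
            // SDRWhiteLevel is expressed so that 1000 == 80 nits.
            return whiteLevel.SDRWhiteLevel * 80.0f / 1000.0f;
        }
    }
    return 80.0f; // fallback when the query fails
}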
julianxhokaxhiu commented 1 year ago

Ok let's focus on 1 2 and 3 :) They look like good pointers to me. Lmk when you have the patch ready on the PR and I'll be happy to test it. Thank you!

ChthonVII commented 1 year ago

Before I go and break things, I think it's prudent to ask how the shader control flow works for non-movie content. I'm sorry that this basically amounts to asking "how do FF7 rendering and bgfx work?"

Stripped really bare, FFNx.lighting.frag looks like this:

vec4 color = vec4(toLinear(v_color0.rgb), v_color0.a);
if (isTexture)
{
    if (isYUV)
    {
        // movie processing
    }
    else
    {
        vec4 texture_color = texture2D(tex_0, v_texcoord0.xy);

        if (isNmlTextureLoaded) color_nml = texture2D(tex_5, v_texcoord0.xy);
        if (isPbrTextureLoaded) color_pbr = texture2D(tex_6, v_texcoord0.xy);

        if (doAlphaTest)
        {
            // alpha testing...
        }

        if (isFBTexture)
        {
            if(all(equal(texture_color.rgb,vec3_splat(0.0)))) discard;

            if (!(isHDR)) {
                texture_color.rgb = toLinear(texture_color.rgb);
            }
        }

        if (modulateAlpha) color *= texture_color;
        else
        {
            color.rgb *= texture_color.rgb;
            color.a = texture_color.a;
        }
    }
}

if(isTLVertex)
{
    gl_FragColor = color;
    // lighting stuff
}
else
{
    // lighting stuff
}

if (!(isHDR)) {
    // SDR screens require the Gamma output to properly render light scenes
    gl_FragColor.rgb = toGamma(gl_FragColor.rgb);
}

First off, thus far I've been operating under the assumption that the isYUV branch is used solely for movies. Is that correct? Is there any other YUV content I might accidentally be breaking?

The control flow for the rest is a bit of a mystery to me. What's coming in via v_color0? In the isTexture case, why are we multiplying v_color0 against color values from a texture? What does isFBTexture mean? Why is this the only type of texture to get linearized? (Doesn't this mean we're multiplying a gamma-space texture_color with a linear color when !isFBTexture?) What gets passed to FFNx.post? Individual fragments or a full finished frame?

I'm really struggling to make sense of the use of gamma functions. Since there's only one unconditional call to toLinear(), everything has to be passing through there on its way to FFNx.post. (Else it wouldn't have linear input.) Does that mean that textures pass through FFNx.frag/FFNx.lighting.frag twice -- once as tex_0 and then again as v_color0? If things are going around twice, why are we linearizing and to-gamma-ing them repeatedly? Can't we just move toLinear() right up behind the texture sampling and move the toGamma() to FFNx.post? Do we have any color data that's not starting its life as a texture (or a movie), that we'd miss linearizing if we did that?

edit: Are textures being linearized somewhere in C++ land at the time when they're first read in?

edit: It looks like, yes, BGFX is linearizing textures when it reads them in. That solves the mystery of how stuff is getting linearized ahead of FFNx.post without wild and circuitous control flow. I think I can probably resolve the rest of my own questions with some trial and error testing. Though I would appreciate it if someone could confirm my understanding.

julianxhokaxhiu commented 1 year ago

@ChthonVII I took some time before properly replying to you, as this is a massive topic to cover here. Overall I'd say, if you have no knowledge of how GPU development and shaders work, start with something like https://learnopengl.com/

There you can find all the theory you need to understand how each pass works and what it means to move things from one shader to the other.

Given that, I'll now try to reply to each question briefly.

What's coming in via v_color0? In the isTexture case, why are we multiplying v_color0 against color values from a texture?

For movies it is always black, but usually it's the color of the polygon you're drawing, which you can "mix" with the texture color. Through this combination you can visually alter the texture color without needing a different palette per texture ( that concept exists in FF7, but on modern GPUs it no longer does ).

What does isFBTexture mean? Why is this the only type of texture to get linearized? (Doesn't this mean we're multiplying a gamma-space texture_color with a linear color when !isFBTexture?)

Frame Buffer texture: it is the texture coming directly from the GPU render target we are drawing to. This is used in FF7 and FF8 to compose some effects ( for example, in FF7 it is used to compose the battle swirl effect ). Overall I'd say you don't need to do anything there.

What gets passed to FFNx.post? Individual fragments or a full finished frame?

Full finished frame.

Can't we just move toLinear() right up behind the texture sampling and move the toGamma() to FFNx.post? Do we have any color data that's not starting its life as a texture (or a movie), that we'd miss linearizing if we did that?

If this is about lighting, I'm afraid only @CosmosXIII can provide more context. Regarding moving the logic: as far as I remember, the HDR filtering is applied there, so the frame comes "natively rendered" based on the framebuffer color type assigned ( for SDR we use one type, for HDR we use another ), and then we apply the HDR "filter" to boost the colors, since the game natively renders internally as SDR ( colors are not "boosted" for HDR unless @CosmosXIII implements that for lighting; this is also why we do everything in post, because he might want to skip this in the future ).

If you want to understand more how HDR works technically speaking I'd suggest you reading this part: https://learnopengl.com/Advanced-Lighting/HDR

Are textures being linearized somewhere in C++ land at the time when they're first read in?

No, there is no processing happening at the CPU level. Everything is sent to the GPU as it is in the file. Unless the file itself has been "manipulated" to be brighter or whatever, we do nothing special in the driver code.

CosmosXIII commented 1 year ago

@ChthonVII While @julianxhokaxhiu is right in that we do not do any processing of the textures on the CPU side, we actually set the sRGB flag for color textures here: https://github.com/julianxhokaxhiu/FFNx/blob/master/src/renderer.cpp#L1395
When this flag is set, the GPU hardware auto-converts from sRGB to linear space when doing a texture fetch. So you may be wondering why not do this in the shader? The reason is that by letting the GPU do it, the linear filtering when fetching between two texels is done correctly. Otherwise it would linearly interpolate between non-linear sRGB values. Another reason is that it's faster this way.
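
As a rough illustration of the idea (the function and arguments here are placeholders, not the actual renderer.cpp call):

#include <bgfx/bgfx.h>

// Sketch only: when the texture holds color data, request hardware sRGB decoding
// so the sRGB->linear conversion happens on fetch, before filtering.
bgfx::TextureHandle create_color_texture(uint16_t width, uint16_t height,
                                         const bgfx::Memory* mem, bool isSrgbColorTexture)
{
    uint64_t flags = isSrgbColorTexture ? BGFX_TEXTURE_SRGB : BGFX_TEXTURE_NONE;
    return bgfx::createTexture2D(width, height, false /* no mips */, 1 /* layers */,
                                 bgfx::TextureFormat::BGRA8, flags, mem);
}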

About moving the toGamma conversion to the post effect, I already tried this but it changes how the alpha blending or additive blending effects look. While doing blending in linear space would be the correct way to do it, it changes too much how the game looks. Probably the original developers worked with blending in sRGB and adjusted the alpha values accordingly.

ChthonVII commented 1 year ago

@julianxhokaxhiu & @CosmosXIII

OK, I think I finally understand:

I'm a bit concerned that there might be a bug with the weird texture blending thing. The current code is correct if the idea is to take two colors and do a multiply blend. But that's weird because multiply blend only ever makes colors darker. To make colors lighter, v_color0 would have to have a value > 1.0. If inputs >1.0 are sometimes to be expected, we shouldn't be linearizing v_color0 when isTexture because the gamma function isn't properly defined above 1.0. Do either of you know if values > 1.0 are ever encountered in this way? Or if the design intent is indeed a multiply blend? Or more like adjusting the texture color by a specified fraction?

When this flag is set the GPU hardware auto converts from sRGB to linear space when doing a texture fetch. So you may be wondering why not do this on the shader? The reason is that by letting the gpu do it, the linear filtering when fetching between two texels is done correctly. Otherwise it would linearly interpolate between non-linear sRGB values. Another reason is because it’s faster this way.

As soon as I finally started to think outside the bounds of the shader, this seemed a very likely explanation. The link in my previous post isn't super clear, but it suggests the sRGB flag does what you confirmed it does.

About moving the toGamma conversion to the post effect, I already tried this but it changes how the alpha blending or additive blending effects look. While doing blending in linear space would be the correct way to do it, it changes too much how the game looks. Probably the original developers worked with blending in sRGB and adjusted the alpha values accordingly.

I was afraid of that.


My last remaining goal is to move the gamut conversions as early as possible so that HDR can get the benefit of its wide gamut if and when we can support gamut metadata for textures. While this is straightforward for movies (since they have metadata already), I now see additional complications for non-movie things, and I'm unsure how to proceed:

julianxhokaxhiu commented 1 year ago

I could put stuff in place and comment it out, so it could be activated later if metadata ever becomes a reality. But @julianxhokaxhiu doesn't much care for having commented code sitting around.

Just to be clear on this one: I'm against commented-out code that is basically dead and must be removed later on, but if this code is something that makes sense and there is already technical support for it, feel free to put it in as active code ( not commented out ), just make sure it only gets enabled once you actually have the metadata, so to say. In pseudo-code:

if (do_have_metadata) {
 // your metadata code here
} else {
 // without metadata code here
}

Other than that, I'd suggest, as said multiple times above, focusing only on movies for now, as the PR is about movies. HDR for the rest of the game shall be figured out in another PR. You're making this issue and the PR way too difficult to review, as you're throwing A LOT of stuff at it in one single shot; I'm not a ChatGPT bot with billions of parameters at hand that can evaluate all of them in a fraction of a second :P So overall, use the KISS principle if possible :)

CosmosXIII commented 1 year ago

@ChthonVII One question about the NTSCJ-to-sRGB conversion. If I understand correctly, you are trying to do this on the fly in the shader. But if you can do that, wouldn't it also be possible to do it in an external tool? For movies I think you would need to re-encode, so it may lose quality, but other textures are in lossless formats like png. I'm saying this because expecting people to work on a Japanese format from the 90s is really not ideal (you can't even visualize it correctly in external video/image tools).

ChthonVII commented 1 year ago

@CosmosXIII

But if you can do that wouldn’t it be also possible to do that on an external tool?

Oh, I already made a tool for that. And I plan to convert SYW's upscaled field textures as a short-term solution. But 3 problems remain:

  1. Not doing it in the shader leaves the vanilla game permanently stuck with the wrong gamut. Maybe it's time to accept that no one plays FF7 with any of the original textures anymore, but what about FF8?
  2. It doesn't address color information coming in v_color0. Sooner or later, that will have to be handled in the shader. (Also, I'm still not sure whether the v_color0 multipliers in the isTexture case should be properly considered "colors" for linearization and gamut.)
  3. HDR. Oy. HDR has the potential to look much better with NTSCJ->rec2020 than with NTSCJ->sRGB->rec2020. But that would require either doing it in the shader, or pre-baking two sets of textures for every mod.

I’m saying this because expecting people to work on a Japanese format from the 90s is really not ideal.

I think you perhaps misunderstand. What I want to eventually have is texture gamut metadata. The modder can flag a texture as NTSCJ (if it's an upscale of an original texture) or sRGB (if it's made from scratch in Photoshop/GIMP/etc.) and then the shader will apply the correct gamut conversion if one is needed. The only place where people would have to work in NTSCJ is if they're modifying the raw values that come in through v_color0 (for example, when ESUI themes change the text box colors).

Gamut for movies is basically solved. They have gamut metadata, and you can edit it without re-encoding (and I added instructions for how to do that to the documentation). We just do what the metadata says, or interpret blank metadata as NTSCJ, in most cases.

To go back and expand on problem 3 a bit: Gamut conversions can only be perfect when the white point is the same and the source gamut's red/green/blue points all lie within the target gamut. When the white point is different, you have to scale distances from the white point in CIE XYZ perceptual space. This is inherently imperfect because CIE XYZ space isn't a perfect model of human vision. When the source gamut's vertices lie outside the target gamut, reproducing those colors is simply impossible, and you must choose between clipping (at the price of terrible banding) or scaling (at the price of overall accuracy). The Bradford algorithm is the "state of the art" solution for doing as well as we know how with scaling. Now, as for NTSC-J to sRGB or rec2020: The white point is different in either case, so there's no getting around that. But the situation with the vertices is different. All 3 of NTSCJ's vertices (same as NTSC1953's) are outside sRGB, but they are inside rec2020. (See chart.) So a great deal more accuracy could be achieved with a direct NTSCJ->rec2020 conversion than with an indirect NTSCJ->sRGB->rec2020 conversion.

rgb-color-space-gamut@2x

ChthonVII commented 1 year ago

I still need to do a few things on the documentation and PR summary and such, but I think the HDR code is finished and ready for testing by someone with a HDR monitor.

A few notes:

  1. I'm most worried about the SDR-white-level auto-detection code, since this was a bunch of awful Windows API stuff, and I'm only able to test the fail state. It should report success or failure in FFNx.log. If it fails, enabling trace_renderer should log which API function failed with which error code.
  2. Code for gamut conversions for textures is in place, using the same uniform used to pass movie gamut metadata. Since this uniform defaults to sRGB, it presently does nothing for SDR and only does sRGB->rec2020 conversion for HDR.
  3. I moved the gamut conversions as early as possible. This should make NTSC-J movies look better in HDR because a wider swath of the rec2020 gamut will be used. Later if/when texture metadata is supported, we should get the same benefit there too.
  4. I think I correctly adjusted the lighting code to account for moving the sRGB->rec2020 conversion ahead of it. Though I'd be much obliged to CosmosXIII for a double-check.
  5. I'm interpreting v_color0 as NTSC-J when it's being used in the non-texture case for directly rendering a polygon with a solid color or gradient. Mods that alter these colors via hext files are going to see a color shift. For example, the text box backgrounds in Finishing Touch. To make it easy to undo these color shifts, I've created a tool that will tell you the NTSC-J color that converts to a target sRGB color.

[edit: darn it! I missed something in lighting.] [further edit: should be fixed now]

julianxhokaxhiu commented 1 year ago

@ChthonVII Thanks, I'll try to give the PR a try locally here between today and tomorrow and will report back.

Regarding tooling, make sure you provide an easy-to-use, ready-to-download one for modders, as none of them will build it locally. Consider examples like KimeraCS, CaitSith, Palmer or FF7SND.

Additionally, this means that before merging we'd need to ask modders to come onboard and test all their mods beforehand, as this would be a breaking change. Not an easy one to bring in; we'll have to see how well they receive it.

In any case, consider the idea that we may be asked to add an option for this entire conversion ( at least on the game ) to not happen by default, but more like an opt-in. Not sure how feasible it is, but let's be open to this scenario.

ChthonVII commented 1 year ago

I think I can make a Windows binary available for that tool.

I'm pretty hostile to adding a toggle for this.

First, it's a relatively small color shift to a very narrow functionality. Most mods don't mess with hardcoded colors. Finishing Touch is the only one I'm aware of that's impacted. And I gave them a tool to get back to their original sRGB colors if they want.

Second, this change needs to happen sooner or later for simply getting the vanilla game correct. So it might as well be sooner. The longer we wait, the greater the chance that existing mods impacted by the change will become unmaintained or new mods impacted by the change will get made.

ChthonVII commented 1 year ago

My plans for doing a baked-in gamut conversion on field textures hit a brick wall with animated textures. It seems like only the very brightest parts of the waterfall animations are displaying, while everything else seems static. I'm wondering if I've got files in the wrong format, or if the animation process is sensitive to the input textures' colors in a breakable way.

[Edit: Looks like maybe I've got the format wrong. ??? I can mix-and-match SYW's animations on top of my static base image, and things animated correctly at least, but the other way around the animation is broken. It looks like my files are RGBA8 BC7_UNORM dds with fully-transparent black background and fully solid colored parts. What are they supposed to be?]

[Further edit: Wait a minute... Are these darned things paletted? With a hardcoded palette?]

julianxhokaxhiu commented 1 year ago

Here you can find some HDR screenshots of your code running on top of SYW V6 FMV 30FPS pack. You'll find them zipped, as I used the HDR screenshot functionality ( https://support.microsoft.com/en-us/windows/keyboard-shortcut-for-print-screen-601210c0-b3a9-7b58-bc40-bae4dcf5f108 ), which is the only one supported in Windows to get correct screenshots. All the other apps and shortcuts will give you a washed-out file.

Overall I have to say I'm impressed, as the color rendering looks really vivid and "right". So we're definitely heading in the right direction :)

So what's left from my own PoV is:

Once both these actions are done, I'll start another, final code review and then we can finally merge. PLEASE make sure you have no left-overs when I start that process, as we'll focus only on the code I see.

Final notes: I saw you added the HDR color conversion for the game, even though we did not agree on doing this part, so please put it aside for now and focus only on movie color. Whatever is game related, let's work on it in another PR. If that's not really doable, please explain the PROs and CONs of putting it aside. Thank you in advance and well done!


[Further edit: Wait a minute... Are these darned things paletted? With a hardcoded palette?]

Yes, animated textures are a technique I implemented to overcome the paletted logic the game uses by default. Regarding color information for the DDS files, I think @satsukiyatoshi can answer you better; I honestly have no idea.

ChthonVII commented 1 year ago

In order to get the benefit of HDR's wide gamut (which is responsible for much of the vividness), I had to move the conversion to rec2020 much earlier.

Since I was doing that anyway, I filled out those stanzas with the logic for selecting other possible gamut conversions. But none of them do anything right now because FSMovieFlags.y always has the default value (sRGB) for non-movie content. Except the conversion for directly rendering hard-coded colors via v_color0. I suppose I can just comment that one out for the time being.

However, the animated textures are broken. My best guess as to what's going on is that pixels in the replacement textures are skipped if their color isn't in the palette. So that gets broken when we convert the replacement texture to rec2020 and it no longer matches the palette. (Since I don't have a HDR monitor, I'm testing this on my end by substituting a NTSC-J-to-sRGB conversion for the sRGB-to-rec2020 conversion. But I'd be stunned if one worked correctly while the other didn't.)

Possible courses of action:

  1. Move the conversion to rec2020 back to post.
    • Pros: Animated textures will work again.
    • Cons: We lose the benefit of HDR's wide gamut across the board. My goal of eventually doing gamut-aware texture rendering is probably dead.
  2. Fix the animated texture stuff so that it works regardless of the color of the replacement texture.
    • Pros: Keep the benefit of HDR's wide gamut. We probably fix incidental dead pixels in upscaled animations where interpolation took things outside the palette.
    • Cons: I don't have the faintest clue how to do this. I'd have to ask for you to do this.

Another thing I might have broken is lighting in HDR. I think I correctly adjusted everything to account for lighting now coming after the conversion to rec2020, but I can't test that myself.

I feel very badly that this PR still isn't done.

ChthonVII commented 1 year ago

I've been thinking about how to work around the animated texture problems if they can't be fixed, and I have a truly wild idea: How about doing all the gamut stuff backwards?

It would look something like this:

I am debating with myself whether this would be a better direction even if the problems with animated textures turn out to be easily fixed. In theory, this would be a way to sidestep all the unknown things that could go wrong with immediately shifting the colors.

CosmosXIII commented 1 year ago

@ChthonVII The PR is still in draft mode so I was not able to comment there but I wanted to ask about this REC709toREC2020 function. So usually to do lighting calculations correctly, colors need to be converted to linear space. For this I used the toLinear function, which converts from sRGB to linear (textures are converted automatically by the GPU). My understanding of color spaces is limited but after applying the REC709toREC2020 function are the values still linear? I understand REC2020 is the color space to which we need to convert to when using a HDR monitor but wouldn't that need to be done last after lighting calculations are done? Basically what I want to say is that lighting calculations should be done independently of what the monitor can display in units like lumen, candela and so on, then apply a tonemap function and then do the final conversion to whatever color space is required to display in a monitor/TV (there is no tonemap in FFNx yet).

ChthonVII commented 1 year ago

@CosmosXIII The term "color space" usually refers to a system for mapping light to numbers. RGB, YUV, XYZ, LAB, etc. are all color spaces.

A "color gamut" is a contiguous subset of the human-visible colors that a particular display device is capable of emitting. Color gamut is only meaningful in the XYZ colorspace. (More on this in a minute.) A gamut conversion, GamutAtoGamutB(jkf), means "When DisplayDeviceA receives XYZ input of jfk, it emits light of color L. Return whatever XYZ value that causes DisplayDeviceB to also emit light of color L (or, if that's impossible, the nearest DisplayDeviceB can manage)."

A gamma function pulls the darker colors up brighter. Gamma is only meaningful in RGB colorspace. This serves 2 purposes: (1) SDR devices expect gamma-encoded RGB input. CRTs have a non-linear voltage-to-brightness response that's (roughly) the inverse of the gamma function, and LCDs mimic that. (2) Storing images gamma encoded allows us to spend more bits on the dark tones that human vision is better at distinguishing. The inverse gamma function reverses this and goes back to linear RGB. What about HDR monitors? They too expect gamma-encoded input, just with a different gamma function -- ApplyREC2084Curve().
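
To make the two flavors of "gamma" concrete, here's the textbook math for the sRGB piecewise curve and the ST 2084 / PQ curve (a sketch, not FFNx's exact toLinear()/ApplyREC2084Curve() implementations):

#include <cmath>

// sRGB decode: gamma-encoded [0,1] -> linear [0,1] (piecewise curve with a linear "toe")
float srgb_to_linear(float c)
{
    return (c <= 0.04045f) ? c / 12.92f : std::pow((c + 0.055f) / 1.055f, 2.4f);
}

// ST 2084 / PQ encode: linear luminance in nits -> PQ-encoded [0,1] signal for an HDR display
float pq_encode(float nits)
{
    const float m1 = 2610.0f / 16384.0f;          // 0.1593017578125
    const float m2 = 2523.0f / 4096.0f * 128.0f;  // 78.84375
    const float c1 = 3424.0f / 4096.0f;           // 0.8359375
    const float c2 = 2413.0f / 4096.0f * 32.0f;   // 18.8515625
    const float c3 = 2392.0f / 4096.0f * 32.0f;   // 18.6875
    float y = nits / 10000.0f;                    // PQ is defined against a 10000-nit reference
    float ym1 = std::pow(y, m1);
    return std::pow((c1 + c2 * ym1) / (1.0f + c3 * ym1), m2);
}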

after applying the REC709toREC2020 function are the values still linear?

Short answer: Yes.
Long answer: What the REC709toREC2020() function is doing is converting linear RGB to XYZ using the sRGB (= bt709) red/green/blue points, then doing a gamut conversion (as described above) from the sRGB gamut to the rec2020 gamut, then converting from XYZ back to linear RGB using the rec2020 red/green/blue points. All of that has been precomputed into one matrix multiplication operation (to avoid thousands of redundant operations per frame).
Aside: It's also the wrong function to be using if your input isn't in the sRGB gamut to start with.
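
For reference, the standard precomputed matrix for that conversion (the ITU-R BT.2087 values; FFNx's constants may differ in rounding) is:

// Linear BT.709 RGB -> linear BT.2020 RGB, precomputed as a single 3x3 matrix
// (values from ITU-R BT.2087; both gamuts share the D65 white point, so no
// chromatic adaptation term is needed here).
static const float REC709_TO_REC2020[3][3] = {
    { 0.6274f, 0.3293f, 0.0433f },
    { 0.0691f, 0.9195f, 0.0114f },
    { 0.0164f, 0.0880f, 0.8956f },
};

void rec709_to_rec2020(const float in[3], float out[3])
{
    for (int i = 0; i < 3; ++i)
        out[i] = REC709_TO_REC2020[i][0] * in[0]
               + REC709_TO_REC2020[i][1] * in[1]
               + REC709_TO_REC2020[i][2] * in[2];
}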

wouldn't that need to be done last after lighting calculations are done?

Short answer: No.
Long answer: Gamut conversions are, in theory, distributive: GamutAtoGamutB(color1 * color2) = GamutAtoGamutB(color1) * GamutAtoGamutB(color2). There is a small difference in some cases, arising from gamut conversions being imperfect when the white points are different and/or when the source red/green/blue points lie outside the target gamut. So, my understanding is that values in the lighting system that represent colors need to be gamut converted to the same gamut as the input, but values that represent non-color things (normals, bump maps, depth maps, specular maps, etc.) should not be. So far as I can tell, the only colors used by the lighting system are the direct and ambient light colors and the time-of-day color. So I added conversions for those when HDR is enabled.
Addendum to long answer: And it should be noted that we're presently doing things wrong. For the typical user, a frame contains a mishmash of (1) original/upscaled textures that ought to be converted from the NTSC-J gamut to the sRGB gamut, but are instead misinterpreted as sRGB-gamut, intermingled with (2) from-scratch mod textures that are already in the sRGB gamut. And then we're doing lighting on top of that mishmash using sRGB-gamut lighting colors. To get things right, we need to get everything into the same gamut before lighting is applied. (That said, I am coming around to the idea that it might be better to approach it backwards -- instead of trying to get everything converted to sRGB ASAP, inverse convert the sRGB-gamut mod assets, do everything, then NTSCJ-to-sRGB (or NTSCJ-to-rec2020) convert the final frame in post.)

lighting calculations should be done independently of what the monitor can display in units like lumen, candela and so on

They are. The max nits business is part of the HDR gamma function -- ApplyREC2084Curve() -- not the HDR gamut conversion function -- REC709toREC2020().

and then do the final conversion to whatever color space gamut is required to display in a monitor/TV

Doing a gamut conversion at the end means that some things are going to have to go through 2 conversions. Where we end up with NTSCJ-to-sRGB-to-rec2020, that forfeits the benefit of rec2020's wide gamut and looks less vivid (and less accurate) than it could be.

My current approach of converting everything to the target gamut immediately after it's read in is the best approach in terms of gamut correctness, but it breaks animated textures because of palette crap. And I'm afraid there may be other palette crap lurking around that I don't know about yet.

I'm pondering doing a sRGB-to-NTSCJ conversion on the sRGB-gamut elements only, then doing a NTSCJ-to-something at the end. This would bypass the problems with animated textures, but introduce some clipping errors if the sRGB-gamut elements were very near 0,1,0 or 0,0,1. It would also change how the lighting looked on sRGB-gamut elements, since it would effectively be NTSCJ lighting colors. (Well, it would only impact the time-of-day color since the direct and ambient colors default to white.)

CosmosXIII commented 1 year ago

@ChthonVII Thanks for the clear explanation.

What the REC709toREC2020() function is doing is converting linear RGB to XYZ using the sRGB (= bt709) red/green/blue points, then doing a gamut conversion (as described above) from the sRGB gamut to the rec2020 gamut, then converting from XYZ back to linear RGB using the rec2020 red/green/blue points. All of that has been precomputed into one matrix multiplication operation (to avoid thousands of redundant operations per frame).

Ok, that makes a lot of sense. It would be great if you could write this in a comment (the fact that ultimately it converts back to linear RGB).

So far as I can tell, the only colors used by the lighting system are the direct and ambient light colors and the time-of-day color. So I added conversions for those when HDR is enabled.

Well, there are also the colors from the textures and the vertex colors, but you already seem to apply the conversion to those too.

My current approach of converting everything to the target gamut immediately after it's read in is the best approach in terms of gamut correctness, but it breaks animated textures because of palette crap.

That is strange, I thought the animated textures were processed on the CPU before sending the final image to the GPU for rendering. So I'm not sure why these would get affected...but @julianxhokaxhiu probably knows more about this.

julianxhokaxhiu commented 1 year ago

That is strange, I thought the animated textures were processed on the CPU before sending the final image to the GPU for rendering. So I'm not sure why these would get affected...but @julianxhokaxhiu probably knows more about this.

I'm not entirely sure but I think this is where we decide the color format of the palette textures ( and vanilla textures in general ). As of now all the textures are in BGRA format, and this is where we convert palette textures to BGRA.

Regarding processing, the game engine literally moves pixel colors when running the MPPAL opcode, which does the "animation". As a matter of fact, it's just multiplying each pixel color with a given color which, every N iterations, eventually brings the pixel color back to its original state. We calculate the hash of the result of that multiplication and use it as the basis for uploading the external textures.
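
A tiny sketch of that idea (the function name here is hypothetical, not FFNx's actual code): the hash is computed on the CPU-side pixel array before upload, so shader-side color changes cannot change which external texture is matched:

#include <xxhash.h>
#include <cstddef>
#include <cstdint>

// Hash the CPU-side BGRA bitmap that is about to be uploaded to the GPU.
uint64_t animated_texture_key(const uint32_t* bgra_pixels, size_t pixel_count)
{
    return XXH3_64bits(bgra_pixels, pixel_count * sizeof(uint32_t));
}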

Tbh I'm not understanding where the challenge with these animated textures lies: in the vanilla ones or the external ones?

ChthonVII commented 1 year ago

Tbh I'm not understanding where the challenge with these animated textures lies: in the vanilla ones or the external ones?

BOTH! I get the same wrong results when pre-baking a gamut conversion into the textures as when doing a gamut conversion on texture_color in the shader.

Here's a picture: bustedwaterfall

You can see that only a few of the pixels in the waterfall are animating. (Also, please ignore the red tint on Cloud; that's testing whether v_color0 is involved, which it isn't.)

A bit more detail: The base texture doesn't seem to matter. It's the animation textures where things go haywire.

My best theory is that pixels from the animation textures get skipped if their color values fail some sort of comparison against a hardcoded value. Maybe the "Start colour" parameter has something to do with it?

Aside from the headaches this is causing me, it's probably causing some dead spots in @satsukiyatoshi's upscales where interpolation produced intermediate color values that are out-of-palette.

julianxhokaxhiu commented 1 year ago

Tbh I'm not sure where you are changing this color, but if you're manipulating the original array of pixel colors of the texture, then you're basically breaking every mod supporting animated textures, as the hash will be different.

If you're doing this in the shader, I don't understand why this is a problem, as you do the conversion only after it has been uploaded. Can you maybe clarify how you are doing this piece of code?

CosmosXIII commented 1 year ago

The only place in the shader that discards colors is this part: https://github.com/julianxhokaxhiu/FFNx/blob/master/misc/FFNx.lighting.frag#L126-L166

But I only see you changing the rgb values, not the alpha so I don't think this should be affected.

Another thing that could be problematic is the saturate calls you have in your conversion functions. Do these conversions result in values > 1? If so, the saturate function is clamping the value to 1, and even if the color is animated you may lose the changes in color because it is all being clamped to 1.

ChthonVII commented 1 year ago

Sorry to leave everyone hanging. I was away from home for a few days.

Tbh I'm not sure where you are changing this color, but if you're manipulating the original array of pixel colors of the texture, then you're basically breaking every mod supporting animated textures, as the hash will be different.

If you're doing this in the shader, I don't understand why this is a problem, as you do the conversion only after it has been uploaded. Can you maybe clarify how you are doing this piece of code?

I tried both ways. And both ways gave the same broken results. In one attempt, I made no code changes at all and simply batch converted SYW's textures using my converter program. In the second attempt, I used SYW's textures unchanged and did a gamut conversion on them in the shader immediately after sampling. Basically:

...
vec4 texture_color = texture2D(tex_0, v_texcoord0.xy);

if (!(isFBTexture)){
    if (isHDR){
        texture_color.rgb = convertGamut_NTSCJtoREC2020(texture_color.rgb);
    }
    else {
        texture_color.rgb = convertGamut_NTSCJtoSRGB(texture_color.rgb);
    }
}
...

Both attempts resulted in a broken, fragmentarily-animated waterfall that looks like the picture above.

It seems that this hashing business doesn't work the way I thought it did. I believed (incorrectly it seems) that all I needed was the text in the filename to match whatever was extracted from vanilla FF7. Since that's not it, how does it work? The text of the file name needs to match the hash of the previous file's data? Then how on earth am I getting the broken partial animations instead of nothing?

Anywho, what's getting hashed is load_animated_texture()'s data parameter. Walking back to see where that's coming from, I see common_load_texture() appears to be using a BGFX RendererTextureSlot to store it. Is it possible that BGFX is calling the shader here so that XXH3_64bits() is getting a modified buffer?

@CosmosXIII: It can't be the saturate() in the gamut conversion functions, since I also get broken results just changing the texture files themselves. It's also not the saturate() I added to toLinear() and toGamma(), since it's still broken if I take those out. (Also, we'd have a serious problem if that were it, since those functions aren't properly defined for domain or range outside 0-1.)

For a third attempt, I've moved the NTSCJ-to-something gamut conversions (aside from movies) to post, gated behind a conditional that I'll eventually hook up to a config option. This fixes the problems with the animations. However, it comes at the cost that movies using any gamut other than the selected one must undergo two gamut conversions. This will have a small accuracy cost, plus an additional cost for HDR not getting to use the wide gamut as fully as we could (visually a loss of vibrance).

Testing this third attempt made me realize I've been mistaken in how I've been thinking about the gamuts of various asset classes. My prior thinking was this:

ChthonVII commented 1 year ago

@CosmosXIII

About moving the toGamma conversion to the post effect, I already tried this but it changes how the alpha blending or additive blending effects look. While doing blending in linear space would be the correct way to do it, it changes too much how the game looks. Probably the original developers worked with blending in sRGB and adjusted the alpha values accordingly.

If this is the case, shouldn't we change the HDR code to work like this too?

ChthonVII commented 1 year ago

Does anyone recall if FF8 ever draws over the top of videos like FF7 does?

julianxhokaxhiu commented 1 year ago

Does anyone recall if FF8 ever draws over the top of videos like FF7 does?

AFAIK it does; one example is the castle scene where the team meets Ultimecia sitting on the throne.

It seems that this hashing business doesn't work the way I thought it did. I believed (incorrectly it seems) that all I needed was the text in the filename to match whatever was extracted from vanilla FF7. Since that's not it, how does it work? The text of the file name needs to match the hash of the previous file's data? Then how on earth am I getting the broken partial animations instead of nothing?

Hashes are produced using the manipulated array of pixel color data that the game engine changes using the field opcode MPPAL. As we can't predict the state of the source and the result, the only way to have a "snapshot" is to hash that array of pixel colors, which is literally the bitmap that is going to be uploaded to the GPU as texture data.

Unless you manipulate that array, hashes will always match the ones that mods like SYW are using. In other words, if you do color changes in the shader, you're safe and there won't be any impact on mods.

shikulja commented 1 year ago

Does anyone recall if FF8 ever draws over the top of videos like FF7 does?

At the start of the game, after the first dialogue in the first-aid post.
The combat mission in Dollet (the landing) and the escape from the robot spider.
At the battle between the Gardens (the landing).
The task on the train.
After the prison, watching the explosion as the rockets fly toward the Garden.
When flying into space.
And many more places.

CosmosXIII commented 1 year ago

@ChthonVII

It can't be the saturate() in the gamut conversion functions, since I also get broken results just changing the texture files themselves.

Getting broken results from changing the texture files makes sense; what does not make sense is the shader affecting how the animated textures are processed because, as explained above, animated textures are processed with opcodes on the CPU, and what is sent to the shader for rendering is the final texture. So it has to be something you do in the shader.

If this is the case, shouldn't we change the HDR code to work like this too?

Back then I just wanted to do lighting for 3d models in linear space; I did not want it to affect how the 2D backgrounds look, so there was no real benefit other than mathematical correctness... In the case of HDR it already changes how the game looks entirely anyway, so I'm not sure which is better. For now, if doing it in post works fine, then I would argue to do it that way for the time being; if you find a better way you can always make another PR later.

ChthonVII commented 1 year ago

@CosmosXIII Ah darn, I'm starting to think you're right. Gamut conversion is likely giving us a set of cyan colors for that waterfall that are all outside the sRGB gamut, so they're all getting clipped to the same color. Going back and looking at the results of doing the conversion in post, there seems to be a little clipping there too.

Fixing this is going to require adding a scaling step to the gamut conversion. And that's an ugly problem. It looks like most "state of the art" algorithms for this were designed with printing in mind, and aren't suitable for animated input. (They'd be prone to inconsistent results on identical parts of similar frames, which would yield flickering.) The remaining options look quite complex and computationally intensive, probably requiring precomputation of a LUT.

Could take some time to hammer out.

ChthonVII commented 1 year ago

Status Update!

Longwinded Summary of the Problem: The most basic gamut conversion is simply to convert from linear RGB to XYZ using one set of primaries, then back again using the other set. Two wrinkles may arise that require more complicated solutions.

One potential wrinkle is that this simple conversion gives the wrong results if the white points are not the same. To deal with that, you need to convert linear RGB to XYZ to rho-beta-gamma cone-response space, do a chromatic adaptation using the Bradford method, then back again. This can all be pre-computed into a single matrix multiplication, and those matrices are already in use in the PR.
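
For anyone following along, here's a rough sketch of how that single matrix gets assembled; the Bradford matrix and the recipe are standard, but the helper names and structure here are just for illustration:

#include <array>

using Mat3 = std::array<std::array<double, 3>, 3>;
using Vec3 = std::array<double, 3>;

// Bradford cone-response matrix (XYZ -> rho/beta/gamma)
static const Mat3 BRADFORD = {{{ 0.8951,  0.2664, -0.1614},
                               {-0.7502,  1.7135,  0.0367},
                               { 0.0389, -0.0685,  1.0296}}};

static Mat3 mul(const Mat3& a, const Mat3& b)
{
    Mat3 r{};
    for (int i = 0; i < 3; ++i)
        for (int j = 0; j < 3; ++j)
            for (int k = 0; k < 3; ++k)
                r[i][j] += a[i][k] * b[k][j];
    return r;
}

static Vec3 mul(const Mat3& a, const Vec3& v)
{
    Vec3 r{};
    for (int i = 0; i < 3; ++i)
        for (int k = 0; k < 3; ++k)
            r[i] += a[i][k] * v[k];
    return r;
}

static Mat3 inverse(const Mat3& m)
{
    double det = m[0][0]*(m[1][1]*m[2][2]-m[1][2]*m[2][1])
               - m[0][1]*(m[1][0]*m[2][2]-m[1][2]*m[2][0])
               + m[0][2]*(m[1][0]*m[2][1]-m[1][1]*m[2][0]);
    Mat3 r;
    r[0][0] =  (m[1][1]*m[2][2]-m[1][2]*m[2][1])/det;
    r[0][1] = -(m[0][1]*m[2][2]-m[0][2]*m[2][1])/det;
    r[0][2] =  (m[0][1]*m[1][2]-m[0][2]*m[1][1])/det;
    r[1][0] = -(m[1][0]*m[2][2]-m[1][2]*m[2][0])/det;
    r[1][1] =  (m[0][0]*m[2][2]-m[0][2]*m[2][0])/det;
    r[1][2] = -(m[0][0]*m[1][2]-m[0][2]*m[1][0])/det;
    r[2][0] =  (m[1][0]*m[2][1]-m[1][1]*m[2][0])/det;
    r[2][1] = -(m[0][0]*m[2][1]-m[0][1]*m[2][0])/det;
    r[2][2] =  (m[0][0]*m[1][1]-m[0][1]*m[1][0])/det;
    return r;
}

// Build the single matrix mapping linear source-gamut RGB to linear destination-gamut
// RGB with Bradford chromatic adaptation. The RGBtoXYZ matrices are derived from each
// gamut's primaries and white point (not shown); the white points are given in XYZ.
Mat3 make_gamut_matrix(const Mat3& srcRGBtoXYZ, const Vec3& srcWhiteXYZ,
                       const Mat3& dstRGBtoXYZ, const Vec3& dstWhiteXYZ)
{
    Vec3 srcCone = mul(BRADFORD, srcWhiteXYZ);   // white points in cone-response space
    Vec3 dstCone = mul(BRADFORD, dstWhiteXYZ);

    Mat3 scale = {{{dstCone[0] / srcCone[0], 0, 0},
                   {0, dstCone[1] / srcCone[1], 0},
                   {0, 0, dstCone[2] / srcCone[2]}}};

    Mat3 adapt = mul(inverse(BRADFORD), mul(scale, BRADFORD)); // XYZ -> adapted XYZ
    return mul(inverse(dstRGBtoXYZ), mul(adapt, srcRGBtoXYZ)); // RGB_src -> RGB_dst
}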

The other potential wrinkle is that you may wind up with linear RGB output that's out of bounds, above 1.0 or below 0.0. (I should mention here that this is only a problem for SDR. The rec2020 HDR gamut was purposefully designed to fully enclose all the other gamuts historically in common use.) Outside the world of printing (where this issue is a huge deal), a pretty common approach is "eh, just clip it." This generally works because out-of-bounds colors aren't that common (in photographs), and solving the problem is really fugly. Unfortunately, "just clip it" isn't a suitable approach for FF7, with that darned waterfall as Exhibit A. As CosmosXIII recognized way before me, the problems I was seeing here are out-of-bounds issues. Pretty much the entire detail on the waterfall is in cyan colors outside the sRGB gamut used for SDR monitors, so all that detail just gets lost if you simply clip out-of-bounds colors. It's pretty much the worst possible case. (And I should have been expecting it all along if I weren't so oblivious. Video games aren't photographs, and of course graphic artists use lots of bold saturated colors near the edge of their working gamut.)

Longwinded Summary of the Solution: What we need to solve the out-of-bounds problem is a "gamut (compression) mapping algorithm." In broad strokes, the out-of-bounds colors need to be remapped to a smaller zone just inside edge of the destination gamut, and the near-the-edge colors need to be remapped to a smaller zone further back from the edge to make room for that. So we wind up trading away some colorimetric fidelity in exchange for preserving some of the out-of-bounds detail. There are a number of hurdles to overcome:

  1. Gamut mapping is too complex and compute heavy to do in the shader. I need to write a standalone program to generate LUTs, then make FFNx use those LUTS for gamut conversions in SDR mode.
  2. We need a LCh-like polar colorspace that's perceptually uniform. LCh is relatively straightforward, but it's not perceptually uniform, which can lead to hue shifts in blue colors. In theory, the best option is a newfangled colorspace named Jzazbz (or rather its polar cousin JzCzhz). Jzazbz has proved something of a pain to implement.
  3. We need a way to determine where the gamut boundaries are. This turns out to be a really hard problem. I'm fortunate to be working on this now, since the only good solution was just published in 2022. (This step is where the miserable amount of compute is needed.)
  4. Finally, we need an actual mapping algorithm. Unfortunately, much of the really cool stuff is designed with printing in mind, and isn't suitable for animated content since it won't behave consistently from frame to frame. So I've resorted to a couple of old, less sexy algorithms (GCUSP, chroma-only mapping) and some minor tweaks borrowed from newer stuff. (Though so far it doesn't seem like the tweaks help much.)

Presently, the standalone LUT generator is nearing completion. It's done enough that it can process images, and do a passable job on a screenshot of that darned waterfall. (I'll edit this to post a sample tomorrow.) Once the LUT generator is done, I'll work on making FFNx use the LUTs for gamut conversions in SDR mode.
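
As a preview of how a LUT would be applied at runtime, here's a minimal trilinear-lookup sketch; the table layout and names are assumptions for illustration, not the generator's actual output format, and in the shader this would more likely be a 3D texture sampled with hardware filtering:

#include <algorithm>
#include <cmath>
#include <cstddef>
#include <vector>

// Hypothetical layout: an n*n*n table of RGB triples, indexed by the input color, red fastest.
struct Lut3D {
    int n = 33;                 // grid size per axis (33 is a common choice)
    std::vector<float> rgb;     // n*n*n*3 floats
    const float* at(int r, int g, int b) const {
        return &rgb[((static_cast<size_t>(b) * n + g) * n + r) * 3];
    }
};

// Trilinear lookup: map an input color in [0,1]^3 through the LUT.
void lut_sample(const Lut3D& lut, const float in[3], float out[3])
{
    const int n = lut.n;
    float pos[3], frac[3];
    int i0[3], i1[3];
    for (int c = 0; c < 3; ++c) {
        pos[c] = std::clamp(in[c], 0.0f, 1.0f) * (n - 1);
        i0[c] = static_cast<int>(std::floor(pos[c]));
        i1[c] = std::min(i0[c] + 1, n - 1);
        frac[c] = pos[c] - i0[c];
    }
    out[0] = out[1] = out[2] = 0.0f;
    // Blend the 8 surrounding grid points.
    for (int db = 0; db < 2; ++db)
        for (int dg = 0; dg < 2; ++dg)
            for (int dr = 0; dr < 2; ++dr) {
                float w = (dr ? frac[0] : 1.0f - frac[0])
                        * (dg ? frac[1] : 1.0f - frac[1])
                        * (db ? frac[2] : 1.0f - frac[2]);
                const float* p = lut.at(dr ? i1[0] : i0[0],
                                        dg ? i1[1] : i0[1],
                                        db ? i1[2] : i0[2]);
                for (int c = 0; c < 3; ++c) out[c] += w * p[c];
            }
}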

Edit: Here are some pictures. Picture#1 is a sRGB screenshot. Picture#2 is that same shot run through a NTSCJ-to-sRGB gamut conversion with Bradford chromatic adaptation and out-of-bounds colors clipped. (This should be equivalent to the current behavior of my branch.) A few things to note about Picture#2:

Picture#1 srgbwaterfall

Picture#2 wfcliptest

Picture#3 gma

Picture#4 problemzone

Picture#5 rawgcuspsoftbigknee

julianxhokaxhiu commented 1 year ago

This status update screams pain all over it, but also knowledge and experience. Looking forward to seeing how the LUT part will look.

Although I'd invite you now, since the topic is becoming way more complex than just fixing colors in movies, to start chunking the PRs. I'll therefore close #534 in favor of multiple smaller PRs. Chunk them by "layer": first fix movies, then normal content, and finally animated textures. Otherwise it will be impossible for me to review.

Keep going, looking forward to the final solution :) Thanks in advance!

ChthonVII commented 1 year ago

It's all muddled up together. Any intermediate PR is going to leave something in a half-broken state. In any event, I expect there should only be 2 more commits.

ChthonVII commented 1 year ago

Opinions please. The following images use two different gamut mapping algorithms. Which one looks better? (Miraculously, they are pixel-identical except for the flowers that are within the blue light beams. So please concentrate your attention there.)

PictureV: vpbig

PictureG: gma

eve-atum commented 1 year ago

I did the toggling two tabs side-by-side thing on my desktop (GTX 960 driving two Asus VN247), laptop (Intel UHD 620, driving both the built-in screen and an Acer G195W via HDMI->DVI), and iPhone SE2, and couldn't see any difference between the two, even in the blue-beamed flowers. Perhaps Git did some kind of compression?

ChthonVII commented 1 year ago

@CosmosXIII Would color values less than 0 or greater than 1 break the lighting stuff? (I could do some things better for HDR than SDR if I could let out-of-bounds values ride until the post processing fragment shader.)

@eve-atum Look at pixel 793, 850 in the left-most petal of the left-most flower on the house's lower balcony. In PictureG it's 183, 250, 128, but in PictureV it's 179, 254, 114. At least on this bunch of flowers, I think I'm liking V better. It's more saturated and has slightly better contrast. (Aside from that, I'm still in shock that both algorithms somehow wound up at pixel-for-pixel identical results for the waterfall.)

@all LUT function is implemented and working. Looks better than expected! Still need to set up LUTs for all the possible movie gamut inputs and outputs, and maybe get clearance from CosmosXIII to permit out-of-bounds values.

ChthonVII commented 1 year ago

Status Update:

It turned out that my gamut program had an embarrassingly stupid copy/paste bug that caused the parameter for changing the mapping algorithm to change a different parameter instead. Doh! So that's why those two images came out so similar... It also meant that a couple hundred LOC that I thought were tested and working correctly had in fact never run and were full of bugs. Anyway, that's all fixed now, and now I have a real image comparison to offer!

(Aside: This site is useful for comparing images.)

To keep this a little shorter, I'm not going to repost the sRGB screenshot or the clipped gamut conversion, both of which you can see a few posts up. What I'm going to post here are 3 algorithms with 2 parameter sets each, for a total of 6 images. Then some discussion below.

Image1: CUSP, paramsA cusp90

Image2: CUSP, paramsB cusp4090

Image3: HLPCM, paramsA hlpcm90

Image4: HLPCM, paramsB hlpcm4090

Image5: VP, paramsA vp90

Image6: VP, paramsB vp4090

My impressions are:

So, please take a look and let me know which looks the best to you!

Other news:

ChthonVII commented 1 year ago

After giving up on fixing the theoretical flaws in VP last night, a solution occurred to me this morning. By sheer coincidence, it looks like the "dark parts of the waterfall are a bit too dark" issue may have been a manifestation of one of these flaws, since the dark parts of the waterfall are less dark with my new algorithm "VPR." There's obviously less contrast in the waterfall, but all the details VP preserved seem to be visible in VPR too. And VPR is definitely preserving some details that HLPCM isn't. Kinda feeling like I hit the jackpot here. So here are images for VPR with both sets of parameters. Interestingly, they seem to make a bigger difference here than with any other candidate. Not sure which I like better.

Image7: VPR, paramsA vpr90

Image8: VPR, paramsB vpr4090

ChthonVII commented 1 year ago

I think I'm done. I'll put in the PR tomorrow after writing up a decent summary for the PR.

eve-atum commented 1 year ago

@ChthonVII Sorry, mate, but I'm just not seeing any difference between the two using the desktop hardware I mentioned earlier. I even zoomed in on that flower petal you mentioned earlier just to be sure. Not saying it isn't there, just that my hardware/wetware combination is either too far out-of-spec, too old, or maybe a little of both.

ChthonVII commented 1 year ago

In case anyone wants a preview of the new movie handling and NTSC-J mode:

  1. Back up your FFNx files.
  2. Grab this preview build and overwrite your existing files.
  3. Make sure that FFNx.toml includes enable_ntscj_gamut_mode = true

Some notes on mod compatibility:

CosmosXIII commented 1 year ago

@ChthonVII Sorry for the late reply.

Would color values less than 0 or greater than 1 break the lighting stuff? (I could do some things better for HDR than SDR if I could let out-of-bounds values ride until the post processing fragment shader.)

To me that sounds problematic. Having color values less than 0 would mean having negative luminance in the diffuse term. Then having colors greater than 1 would be equivalent to having the surface emit light. So from an energy conservation point of view it does not sound great.

julianxhokaxhiu commented 1 year ago

As we speak about color correction, I just want to leave this link here, might be useful to everyone tuned in on the topic: https://jlongster.com/color-space-experiments

ChthonVII commented 1 year ago

@ChthonVII Sorry for the late reply.

Would color values less than 0 or greater than 1 break the lighting stuff? (I could do some things better for HDR than SDR if I could let out-of-bounds values ride until the post processing fragment shader.)

To me that sounds problematic. Having color values less than 0 would mean having negative luminance in the diffuse term. Then having colors greater than 1 would be equivalent to having the surface emit light. So from an energy conservation point of view it does not sound great.

Thank you! I was afraid that was the case, but it's good to hear an authoritative answer.