godotengine / godot

Godot Engine – Multi-platform 2D and 3D game engine
https://godotengine.org
MIT License

Drawing linear colors to an HDR2D Viewport is not possible #93206

Open Motioneer opened 2 weeks ago

Motioneer commented 2 weeks ago

Tested versions

Tested in 4.3 dev and 4.2.1 (stable)

System information

v4.3.beta.custom_build [71699e08c]

Issue description

When drawing in an HDR2D SubViewport, an sRGB-to-linear colorspace conversion happens, regardless of the source's colorspace. Shaders also receive an incorrectly corrected sampler.

See MRP:

(screenshot from the MRP)

Manually applying a linear-to-sRGB transformation in a shader corrects the color.

Not in the MRP, but also tested:

Steps to reproduce

  1. Create a SubViewport
  2. Enable HDR2D on it
  3. Draw an HDR source in said viewport

or

  1. Open MRP

Minimal reproduction project (MRP)

srgb_linear_test.zip

Motioneer commented 2 weeks ago

Some updates:

The visual output of the darkened color is actually correct. It is simply the linear color displayed in sRGB space without correction applied.

I don't know if that is a wrong assumption, but I expected a shader operating on a canvas item in an HDR2D SubViewport to still sample a neutral grey color of rgb(0.5,0.5,0.5) as rgb(0.5,0.5,0.5). This is what the in-shader sampler check tests for.

Right now, however, it receives rgb(0.214, 0.214, 0.214), which looks like the result of an sRGB-to-linear conversion.

My current theory is that this is actually happening because the ColorRect converts its color to linear when it is drawn. I assumed that setting its color would set a linear color, instead of an sRGB color that then gets converted.

This apparently also means that anything drawn in a _draw function gets implicitly converted from sRGB to linear, which was not at all what I expected.
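To illustrate, a minimal sketch of what I mean (the values match the grey example above):

# CanvasItem inside an HDR2D SubViewport.
func _draw() -> void:
    # Drawn as rgb(0.5, 0.5, 0.5), but a shader sampling the viewport texture
    # sees roughly rgb(0.214, 0.214, 0.214), i.e. the value after an implicit
    # sRGB-to-linear conversion.
    draw_rect(Rect2(0, 0, 64, 64), Color(0.5, 0.5, 0.5))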

Motioneer commented 2 weeks ago

Thank you akien, I was struggling with the spelling. ;)

Calinou commented 2 weeks ago
Motioneer commented 2 weeks ago

Since this has been labelled as a documentation issue, what is the expected workflow for drawing canvas items that are in linear colorspace? For items with a color parameter, converting the color in code to sRGB so that it can be converted back to linear?

color.linear_to_srgb()

For textures, applying a conversion shader would work. But it does seem a bit of a roundabout solution. 😅

clayjohn commented 2 weeks ago

@Motioneer When using HDR2D, rendering happens in linear space. No conversion is needed on your end; Godot will automatically adjust so that everything is in linear space. Note that Godot isn't converting your scene to linear space, it just renders in linear space. The docs currently have this to say:

Additionally, 2D rendering will take place in linear color space and will be converted to sRGB space immediately before blitting to the screen (if the Viewport is attached to the screen). Practically speaking, this means that the end result of the Viewport will not be clamped into the 0-1 range and can be used in 3D rendering without color space adjustments.

If you want to sample that Viewport without displaying it to the screen, and you need the viewport to be in sRGB space, you are responsible for tonemapping the viewport yourself.

I'm not sure what your use-case is. But it might help to understand the reason we have the hdr_2d mode. It was created to allow:

  1. Using 2D viewports in 3D rendering with full precision (for example, to allow users to use Glow in 2D)
  2. Allowing Godot to eventually support HDR monitors

In both cases, it is absolutely necessary to render in linear space and avoid any color space conversions until the Viewport is blitted to the screen. If you have a use-case where you require a higher dynamic range, but you don't want the final result to be in the linear color space, you have to tonemap yourself, either in a fullscreen shader pass in the Viewport or when sampling the viewport.
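As a rough sketch (the uniform name is just an example), tonemapping while sampling could look something like this, using a simple gamma approximation of the sRGB transfer:

shader_type canvas_item;

// Texture of the HDR2D Viewport, containing linear color data.
uniform sampler2D viewport_tex : filter_linear;

void fragment() {
    vec3 linear_color = texture(viewport_tex, UV).rgb;
    // Approximate linear -> sRGB transfer; swap in whatever tonemap you prefer.
    COLOR.rgb = pow(linear_color, vec3(1.0 / 2.2));
}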

Motioneer commented 2 weeks ago

@clayjohn Thank you for responding!

sRGB is not something I want in this case, since I don't actually draw the HDR SubViewport to screen. Instead it is an input to a smoke simulation:

(GIF: smoke simulation preview)

The simulation is a fragment shader that offsets the UV based on the "wind speed" input, which is an HDR SubViewport. (Visible in the second part of the GIF.)

I map the x, y (-1.0 .. 1.0) coordinates of the wind direction to the red and green channels (0.0 .. 1.0) when drawing to the SubViewport, and reverse the mapping in the simulation shader.

Here is the relevant code snippet for the color calculation of the "wind lines", for clarity:

var direction: Vector2 = start_pos.direction_to(end_pos)

# Modify from -1 .. 1 space to 0 .. 1
var scaled_direction: Vector2 = direction * 0.5
var moved_direction: Vector2 = scaled_direction + Vector2(0.5, 0.5)

var directional_color := Color()
directional_color.r = moved_direction.x
directional_color.g = moved_direction.y

var srgb_color = directional_color.linear_to_srgb()

I originally assumed that, since I was drawing in an HDR SubViewport, I would be drawing linear colors and could map the coordinates directly to the channels. But without the sRGB transformation, the coordinates are thrown off whenever they are not exactly 0.0 or 1.0, so the last line with the linear-to-sRGB conversion was necessary.
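As a concrete example: a direction component of 0.0 is encoded as 0.5, the implicit sRGB-to-linear conversion turns that into roughly 0.214, and decoding that in the shader gives 0.214 * 2.0 - 1.0 ≈ -0.57 instead of 0.0.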

Please let me know if this explanation was coherent!

clayjohn commented 2 weeks ago

@Motioneer Thank you for the detailed explanation! The problem in this case comes from you encoding a directional vector in a color.

In Godot, all color inputs to 2D shaders are in sRGB space (i.e. colors, textures, etc.). For textures, you can specify with the uniform whether the texture contains sRGB color data (and may need to be converted) or whether it contains raw data. For uniforms, we use the input type to determine whether the input is a color (and may need to be converted) or raw data: Vectors are treated as raw data, while Colors are treated as colors.

In your case, you should be using either a Vector2 or a Vector4 to pass in your directional data. Unless I am missing something and it is important for you to be storing a raw direction in a Color variable?
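For illustration, a rough sketch of the difference on the script side (the uniform names are made up, and direction stands in for whatever Vector2 you are encoding):

# A Color is treated as a color: it is assumed to be sRGB and may be converted
# before the shader sees it.
material.set_shader_parameter("tint", Color(0.5, 0.5, 0.5))
# A Vector4 is treated as raw data and passed through unchanged.
material.set_shader_parameter("wind_dir", Vector4(direction.x, direction.y, 0.0, 0.0))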

Motioneer commented 2 weeks ago

@clayjohn I am not sure I follow. The reason I am using a SubViewport as a "wind texture" is that I can use the GPU to draw lines in it, enabling per-pixel modification in the simulation shader.

I don't know if it is visible in the GIF, but the SubViewport never clears, blends old lines together, fades to grey, and will itself have a shader that modifies its content over time. This gives me a really cheap way to simulate complex airflow.

I am not quite sure how that would work with passing raw vectors?

clayjohn commented 2 weeks ago

@Motioneer I mean that instead of passing directional_color as a uniform to the shader, you should pass moved_direction directly, so that the shader knows you are passing raw data and not a color.
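In code, that would be something like this (the uniform name is just an example):

material.set_shader_parameter("wind_direction", moved_direction) # Vector2: treated as raw data, no color conversion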

Motioneer commented 2 weeks ago

@clayjohn
For clarification, the simulation shader draws in a "smoke" HDR SubViewport of its own, where it reads the previous frame and modifies it on a per-pixel basis using the "wind texture" pixel data, which is passed by a uniform:

uniform sampler2D wind_texture: filter_linear, source_color;

Are you proposing I pass all "wind lines", as I have dubbed them, as a vector array and then figure out which pixels to modify and in what way, in the shader?

uniform vec2[10] wind_lines;

Sorry, I'm not sure what you mean by passing raw data. 😅

The moved_direction itself is only part of the information I need, since I actually need everything necessary to know which pixels are affected by this "line": the start and end position as well as the width.

clayjohn commented 2 weeks ago

Where is the wind_texture coming from? Is it your Viewport with HDR_2D enabled? Note that source_color tells the shader compiler that the source texture contains sRGB color data. If you are writing linear data to a texture and then reading it with source_color, you won't get accurate results.

Are you proposing I pass all "wind lines", as I have dubbed them, as a vector array and then figure out which pixels to modify and in what way, in the shader?

I have no idea what "wind lines" are in the context of your application. I am suggesting using moved_direction instead of directional_color if moved_direction is ultimately a directional vector that you want to pass in as a uniform.

Without seeing your code, I can't really provide very direct feedback, but two things to consider:

  1. If a uniform is passed to the shader and the source is a Color, then Godot will transform the color from sRGB space into linear space. So when you are dealing with directions (and not sRGB color) you should use a Vector4 uniform instead of a Color uniform.
  2. If the values are passed to the shader by encoding them in a texture, avoid using source_color unless the underlying data is actually sRGB color data (see the sketch below).
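As a rough sketch of both points (the uniform names are illustrative):

shader_type canvas_item;

// Raw data encoded in a texture: no source_color, so the values are read back as written.
uniform sampler2D wind_texture : filter_linear;
// Actual sRGB color data: source_color requests the sRGB -> linear conversion.
uniform sampler2D albedo_texture : filter_linear, source_color;
// Raw directional data as a uniform: use a vector type, not a Color, on the script side.
uniform vec4 wind_dir;
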
Motioneer commented 2 weeks ago

@clayjohn Maybe I should have done this from the start. Here is a simplified version of my setup:

issue_explanation.zip

(screenshot of the example setup)

Setup Explanation

I have fully commented the code, to make it as easy as possible to follow.

line_drawer.gd: When running the default scene, lines will be drawn following your mouse in the right ("wind") SubViewport. These are the "wind lines" I was referring to.

The direction of the line is encoded in the color. The color is converted from linear to sRGB or used directly, based on the "Convert Line and Clear Color to sRGB" checkbox.

clear_color.gd: Draws a semi-transparent clear color each frame. For the "smoke" SubViewport the clear color is black. For the "wind" SubViewport it is neutral grey, rgb(0.5,0.5,0.5). The color is drawn with or without the linear-to-sRGB conversion, based on the checkbox.

smoke_simulation_example.gdshader: This shader is attached to the "SmokeSimulationShader" Node in the "smoke" SubViewport. It samples the last frame of the "smoke" SubViewport as well as the "wind" ViewportTexture.

The sample UV of the last "smoke" frame is offset by the decoded "wind" direction from the "wind" ViewportTexture.

Please note that the source_color hint for the "wind" ViewportTexture is not present, and does not change the color when added. Please swap the commented lines in the shader to test this.

uniform sampler2D wind_texture: filter_linear;
// uniform sampler2D wind_texture: filter_linear, source_color;

Testing

Disabling the "Convert Line and Clear Color to sRGB" checkbox will break the direction encoding. The clear color is too dark, causing the whole simulation to shift to the left and up. The colors of the "wind lines" no longer represent the correct direction.

(screenshot: broken direction encoding)

Disabling HDR on the SubViewports as well fixes the direction encoding:

(screenshot: correct direction encoding with HDR disabled)

Conclusion

When using HDR SubViewports and reading the ViewportTexture as a shader uniform, any color drawn to the SubViewport needs to be converted from linear to sRGB in order to be read back as linear in the shader.

The source_color hint for the ViewportTexture has no effect on this.

This leads me to believe that somewhere in this pipeline, an sRGB-to-linear conversion happens implicitly, and I don't know where.

Motioneer commented 2 weeks ago

I think I can boil the issue down to this:

  1. In an HDR2D Viewport
  2. any color drawn
  3. will have gone through an sRGB-to-linear conversion.

There is no way to avoid this conversion.

It is apparently not possible to draw linear colors directly.

(GIF: linear sampling example)

linear_sampling_example.zip

The shader on the right directly maps the color channel value to the coloring of the UV.

if (1.0 - UV.y < color.r) {
    COLOR.r = 1.0;
}

bs-mwoerner commented 2 weeks ago

I think what clayjohn meant was: don't pass in your parameter as a "color" (a tuple of values that need to be converted between color spaces) but as a generic vector (a tuple of values that should be passed through as-is).

In your shader:

shader_type canvas_item;
uniform vec4 color;

void fragment() {
    //vec3 color = COLOR.rgb;
    [...]

In your script:

    [...]
    #color_rect.color = color
    color_rect.material.set_shader_parameter("color", color)
    [...]

Motioneer commented 2 weeks ago

@bs-mwoerner Thank you for your comment, but I am not setting a color. If you open up the project in my previous comment, it is just a CanvasItem shader on a ColorRect.

bs-mwoerner commented 2 weeks ago

I was referring to this line in your script

color_rect.color = color

and this line in your shader:

vec3 color = COLOR.rgb;

For the purpose of Godot shaders, this would be considered setting and reading a color.

Motioneer commented 2 weeks ago

@bs-mwoerner I see, thank you for clarifying.

The reason why I don't pass a generic vector is that, in my use case, I don't need a single vector or color, but want to sample a ViewportTexture. The previous example was only meant to illustrate the color conversion happening.

In this example, you can see the ViewportTexture being sampled.

bs-mwoerner commented 2 weeks ago

For reading from a texture instead of a single value, the syntax would change to

shader_type canvas_item;
uniform sampler2D input;

void fragment() {
    vec3 color = texture(input, UV).rgb;

and

color_rect.material.set_shader_parameter("input", tex)

With tex being some variable that holds the texture in question.

Motioneer commented 2 weeks ago

@bs-mwoerner Thank you, this is equivalent to what I used in the linked example.

uniform sampler2D wind_texture: filter_linear;
// source_color does not change the sampling:
// uniform sampler2D wind_texture: filter_linear, source_color;

The problem is that I can't draw a linear color to an HDR2D Viewport and then sample the ViewportTexture to get the same linear color back.

clayjohn commented 2 weeks ago

I took a look at the new MRP and I can see why this is causing so much frustration. @bs-mwoerner and I have been talking about uniforms and texture sampling, but the MRP doesn't have a problem with uniforms and texture sampling.

The problem comes from the fact that you are passing data in the color parameter of draw_line. As with Color uniforms, Godot assumes that colors are in sRGB space, so it does an sRGB-to-linear conversion before passing that color value into the shader to ensure that rendering happens in linear space.

This situation is extra complex because your chosen approach for this effect requires you to use colors to pass in non-color data.

Let's go through the steps one by one:

  1. Draw the line with the wind direction data (the wind direction needs to be encoded in sRGB if it is passed as a color; see the sketch after this list)
  2. Sample the wind direction texture (the texture is in linear space, so do not use source_color, which would do an sRGB -> linear conversion)
  3. Draw into the final texture (if the new texture also uses HDR2D, the data will still be linear)
  4. Draw the final texture to the screen (if drawing to a non-HDR2D Viewport, you need to manually convert to sRGB; if drawing directly to the screen, Godot will handle the linear -> sRGB conversion)
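For step 1, a minimal sketch of what that encoding could look like (variable names follow the snippet posted earlier; this is just one way to do it):

# Direction components are in -1 .. 1; remap to 0 .. 1 as before.
var moved_direction: Vector2 = start_pos.direction_to(end_pos) * 0.5 + Vector2(0.5, 0.5)
var directional_color := Color(moved_direction.x, moved_direction.y, 0.0)
# draw_line() interprets its color parameter as sRGB and converts it to linear
# for the HDR2D target, so pre-apply the inverse conversion to end up with the
# intended linear values in the viewport texture.
draw_line(start_pos, end_pos, directional_color.linear_to_srgb(), 2.0)
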
bs-mwoerner commented 2 weeks ago

Yes, I got a bit side-tracked by the slider example and the actual problem is indeed not in how the texture is read but rather in how it is filled. It's not a bad idea in principle to use the drawing functions for creating non-color textures (because otherwise you'd have to implement your own line drawing), but since they have color conversion built in, one needs to give them a "fake" color that they convert to the intended numerical value. ("Just don't use a color" doesn't work here, because there's no draw_line_data() function that takes a generic data vector instead of a color value.)

The crucial bit of information (that I felt was true from experience but wasn't able to find a documentation quote for) is that the color parameter for draw_ functions is always interpreted as sRGB and that they will do an sRGB -> Linear conversion when drawing to an HDR target. This can be preemptively undone by calling linear_to_srgb() on the input values before passing them to the function.

Motioneer commented 2 weeks ago

Thank you all for your patience in understanding this issue.

To paraphrase my original question: what is the intended workflow when using draw_line? Is it to do the manual linear-to-sRGB conversion described above?

What is the intended way to use draw_texture in this case? To use a shader on the canvas item that converts the texture while drawing?

clayjohn commented 2 weeks ago

To paraphrase my original question, what is the intended workflow when using draw_line?

The color parameter of draw_line takes a color in sRGB space. If you want to hijack color / COLOR for non-color data, you need to ensure that you encode the data in sRGB space.

What is the intended way to use draw_texture in this case?

I'm not sure what you mean, as we haven't discussed a case for draw_texture above, but the same sort of caveats apply. If you are taking linear data and attempting to read it as if it were sRGB, you will get problems. The texture parameter of draw_texture takes a color texture which has been encoded in sRGB space. If you pass a texture that is encoded in linear space, then you will have to handle the conversion yourself in the shader that reads from that texture.

Motioneer commented 1 week ago

Thank you for clearing that up! I am so deep in the madness of my project that I was very confused about what in my problem was confusing to others. ;)

In the documentation, the whole sRGB/linear blending and color conversion topic is currently only outlined in the Viewport class reference, in the use_hdr_2d property description:

If true, 2D rendering will use an high dynamic range (HDR) format framebuffer matching the bit depth of the 3D framebuffer. When using the Forward+ renderer this will be a RGBA16 framebuffer, while when using the Mobile renderer it will be a RGB10_A2 framebuffer. Additionally, 2D rendering will take place in linear color space and will be converted to sRGB space immediately before blitting to the screen (if the Viewport is attached to the screen). Practically speaking, this means that the end result of the Viewport will not be clamped into the 0-1 range and can be used in 3D rendering without color space adjustments. This allows 2D rendering to take advantage of effects requiring high dynamic range (e.g. 2D glow) as well as substantially improves the appearance of effects requiring highly detailed gradients.

Note: This setting will have no effect when using the GL Compatibility renderer as the GL Compatibility renderer always renders in low dynamic range for performance reasons.

Should we expand on this, or will it bloat the description?

Alternatively, the "Using Viewports" page, in the section about "Rendering", could include an explanation of HDR2D in general, as well as the color conversion. Maybe even move most of the info in the property description to this section instead?

I plan to write a pull request, but could someone more experienced with the documentation give me a pointer as to which approach would be preferable?