gfx-rs / wgpu-native

Native WebGPU implementation based on wgpu-core
Apache License 2.0

Washed out texture colors #386

Open TheOnlySilverClaw opened 3 months ago

TheOnlySilverClaw commented 3 months ago

I'm currently trying to use wgpu-native as a rendering backend with V, so I hope this is the right place to ask about this.

I've managed to get the pipeline setup to work and loaded a texture, but now the colors look wrong. I'm not doing any lighting or effects, yet.

Left side is my texture, right side is how it's rendered:

[Screenshot from 2024-05-19 09-21-06: source texture on the left, washed-out render on the right]

I first suspected the PNG color values to be wrong, but those look fine to me.

Here's my render code. Not Rust, but should be similar enough to get the picture. https://gitlab.com/valdala/valdala/-/blob/6be4385efd7bd3470010355f594345307ace6b8e/application/graphics/renderer.v

Here's the shader: https://gitlab.com/valdala/valdala/-/blob/6be4385efd7bd3470010355f594345307ace6b8e/application/shaders/textured.wgsl

Here's most of the pipeline setup: https://gitlab.com/valdala/valdala/-/blob/6be4385efd7bd3470010355f594345307ace6b8e/application/webgpu/device.v

The sampler type is set to Filtering and the texture format to RGBA8Unorm. The pixel data is provided as four 8-bit unsigned integers per pixel: red, green, blue, and alpha, each from 0 to 255. As far as I understand, that should work?

TheOnlySilverClaw commented 3 months ago

Kinda figured it out.

wgpuSurfaceGetPreferredFormat returns bgra8unormsrgb

wgpuSurfaceGetCapabilities returns bgra8unormsrgb and bgra8unorm

Hardcoding the surface configuration to bgra8unorm fixes the color issue. At least on my device.

So, as far as I understand, as long as I don't have gamma correction implemented, I should pick a non-sRGB format? Is there a utility to map between the sRGB and non-sRGB variants, or would I have to write that myself?
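As far as I know, wgpu-native doesn't ship such a helper. Using the WebGPU spec's string names, where sRGB variants carry a `-srgb` suffix (e.g. `"bgra8unorm-srgb"`), a minimal sketch of the mapping could look like this; wgpu-native's enum values would need an equivalent lookup, so treat this as an illustration, not a shipped utility:

```python
# Sketch: map between sRGB and non-sRGB WebGPU texture format names.
# Note that not every format has an sRGB variant; a real helper would
# validate against the list of formats the surface actually supports.

def to_non_srgb(fmt: str) -> str:
    """Return the linear (non-sRGB) variant of a texture format name."""
    return fmt[:-len("-srgb")] if fmt.endswith("-srgb") else fmt

def to_srgb(fmt: str) -> str:
    """Return the sRGB variant of a texture format name."""
    return fmt if fmt.endswith("-srgb") else fmt + "-srgb"
```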

almarklein commented 3 months ago

This is related to linear (physical) color space vs sRGB. In shaders it's common to work with linear (i.e. physical) colors, so that things like interpolation (across faces) and lighting calculations are correct. But the values you put on your monitor should be in sRGB.

This is a relatively complex topic. I wrote a post about this when I dove into it for our render engine, pygfx. I'll try to explain it briefly for your use-case below.

You can programmatically convert using the following Python code (somewhat verbose for clarity):

def srgb2physical(c):
    # Decode an sRGB-encoded channel value (0..1) to linear intensity.
    if c <= 0.04045:
        return c / 12.92
    else:
        return ((c + 0.055) / 1.055) ** 2.4

def physical2srgb(c):
    # Encode a linear channel value (0..1) back to sRGB.
    if c <= 0.0031308:
        return c * 12.92
    else:
        return c ** (1 / 2.4) * 1.055 - 0.055
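As a quick sanity check of these two functions (my addition, for illustration): encoding a linear value to sRGB and decoding it again should recover the input, and the encoded value is larger for mid-tones, since sRGB brightens them.

```python
def srgb2physical(c):
    # Decode an sRGB-encoded channel value (0..1) to linear intensity.
    if c <= 0.04045:
        return c / 12.92
    return ((c + 0.055) / 1.055) ** 2.4

def physical2srgb(c):
    # Encode a linear channel value (0..1) back to sRGB.
    if c <= 0.0031308:
        return c * 12.92
    return c ** (1 / 2.4) * 1.055 - 0.055

linear = 0.25
encoded = physical2srgb(linear)   # larger than 0.25: sRGB brightens mid-tones
decoded = srgb2physical(encoded)  # recovers the original 0.25
```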

Another option is to use sRGB texture formats, which do the conversion automatically as you sample from / write to them. This may also be hardware accelerated.

What's also good to know is that in the majority of images, the colors are already in sRGB format.


For your situation that means: you load the image, whose data is already sRGB, and when rendering to the bgra8unormsrgb surface the sRGB encoding is applied a second time on write, resulting in the washed-out colors.
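To see why double encoding washes colors out, here is a small numeric illustration (my sketch, reusing the physical2srgb function from above): applying the encoding to an already-encoded value pushes every mid-tone toward 1.0.

```python
def physical2srgb(c):
    # Encode a linear channel value (0..1) to sRGB.
    if c <= 0.0031308:
        return c * 12.92
    return c ** (1 / 2.4) * 1.055 - 0.055

once = physical2srgb(0.5)    # correct single encoding
twice = physical2srgb(once)  # what an sRGB surface does on top of sRGB data
# once ≈ 0.735, twice ≈ 0.873: values drift toward white,
# which is exactly the "washed out" look.
```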

You can either:

  1. Render to a non-srgb texture, because the image is already sRGB. But this also affects the other stuff being rendered.
  2. Load the image into an srgb texture, so that within the shader the colors are physical.
  3. Convert the colors to linear space right after sampling them from the texture.
  4. Convert the image colors to linear space on the CPU, before uploading to the texture.
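For option 4, here is a minimal CPU-side sketch (my illustration, in pure Python; a real loader would vectorize this with numpy or convert at decode time) that decodes 8-bit sRGB pixel data to linear before upload, leaving alpha untouched since alpha is coverage, not color:

```python
def srgb2physical(c):
    # Decode one sRGB channel value (0..1) to linear intensity.
    if c <= 0.04045:
        return c / 12.92
    return ((c + 0.055) / 1.055) ** 2.4

def srgb_bytes_to_linear(rgba: bytes) -> bytes:
    """Decode interleaved 8-bit RGBA data from sRGB to linear in place."""
    out = bytearray(rgba)
    for i in range(0, len(out), 4):
        for ch in range(3):  # R, G, B only; alpha stays as-is
            v = out[i + ch] / 255.0
            out[i + ch] = round(srgb2physical(v) * 255)
    return bytes(out)
```

A caveat: storing linear values in 8 bits loses precision in the dark tones, which is one reason letting the hardware sample from an sRGB texture (option 2) is usually preferable.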

In most cases the second option is probably the best. In pygfx we take option 3, because colors come into the shader via different routes. We use this function to convert the color in the shader:

fn srgb2physical(color: vec3<f32>) -> vec3<f32> {
    let f = pow((color + 0.055) / 1.055, vec3<f32>(2.4));
    let t = color / 12.92;
    return select(f, t, color <= vec3<f32>(0.04045));
}