wtholliday opened this issue 5 years ago
Could you post a screenshot? (TIFF screenshots on Mac preserve 10-bit depth). 10-bit color gradients should be visually indistinguishable from ∞-bit IMO. Perhaps it's an issue with the shader quantizing the colors.
@AndrewBelt See the banding? It's much easier to see in full-screen.
That's a radial gradient rendered between #16171A and #26282B.
I think 10-bit color still has to be dithered for subtle gradients.
The shader seems to use floats for each color channel.
Apple's code dithers, FWIW.
Yes, I see ~10 bands in this. However, this is an 8-bit PNG. You can take TIFF screenshots on Mac with https://www.idownloadblog.com/2014/07/31/how-to-change-mac-screenshot-file-format/, which should preserve 10-bit pixels. What I'm curious about is whether nanovg's rasterizer gives you ~10 or ~40 bands on your screen. The former would suggest that the pixels are being converted to 8-bit before being displayed as 10-bit, which is a problem that should be fixed before dithering is tackled.
@AndrewBelt Sorry, I'm having trouble getting you an image, because I can only reproduce this on iPad, and when I take a screenshot on iPad it gets converted to a PNG.
On iPad, I see ~40 bands in 10-bit color, and it's easy to see the difference. The banding is more subtle, but it's still noticeable. Can you take my word for it?
Besides, isn't it worthwhile to render nice gradients in 8-bit color?
@wtholliday You might find this article comparing various dithering algorithms useful: https://developer.oculus.com/blog/tech-note-shader-snippets-for-efficient-2d-dithering/
To my eye, the minimal-ALU dither17 routine looks better than the Bayer-matrix dithering you linked, and should also be faster. For high-quality dithering, blue noise is definitely the way to go, but it requires a texture or a large constant array.
> ~40 bands in 10-bit color
Okay, 10-bit color works then.
I did a dither implementation in my own NanoVG backend: https://github.com/marcel303/framework/blob/2a449a801860118664d748cb13474d3015b6d4f3/3rdparty/nanovg-framework/nanovg-framework.cpp#L102
It's enabled when NVG_DITHER_GRADIENTS is set at context creation time.
The dither function used (a slightly modified version of the one used by the game Portal) takes the pixel coordinate and returns a tiny bit of noise. This noise gets added to the gradient rgb inside the fill shader. https://github.com/marcel303/framework/blob/2a449a801860118664d748cb13474d3015b6d4f3/framework/data/engine/ShaderUtil.txt#L305
By the way, this is what it looks like:
Compared to this screenshot from the NanoVG repo:
I'm rendering gradients that appear banded even at BGR10 color depth (10 bits per color channel).
So I was thinking of adding gradient dither to nanovg and doing a pull request. Would that be appreciated? Anyone have some tips on what sort of dithering to implement?
This technique looks simple: http://alex-charlton.com/posts/Dithering_on_the_GPU/
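For reference, that article's approach is ordered (Bayer) dithering: each pixel is compared against a small tiled threshold matrix. A minimal C sketch, assuming the standard 4x4 Bayer pattern (`bayer_threshold` is an illustrative name, not from the article):

```c
/* Standard 4x4 Bayer index matrix (values 0..15). */
static const int bayer4[4][4] = {
    { 0,  8,  2, 10},
    {12,  4, 14,  6},
    { 3, 11,  1,  9},
    {15,  7, 13,  5},
};

/* Per-pixel threshold in (0, 1). Add (threshold - 0.5) * step to the
 * color before quantization, or compare the fractional part of the color
 * against it to choose between the two nearest quantized levels. */
static float bayer_threshold(int x, int y)
{
    return (bayer4[y & 3][x & 3] + 0.5f) / 16.0f;
}
```

Ordered dithering needs only a tiny constant array (or a few ALU ops), but its regular pattern is more visible than the noise-style dithers discussed above.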