Well spotted! I had tried to adapt the shader code for these platform-specific differences, but had not noticed that they became an issue for disk-based blending, so this is definitely an important fix. Thank you very much for the commit!
It is quite confusing, and Unity's docs are not very clear on this either, unfortunately.
I understand that `tex2D` look-ups inevitably return platform-specific values, but I'm somewhat disappointed that Unity's functions/macros are not consistent in their behaviour (as far as I can tell).
There are probably other parts in the code that are affected by this. Disk blending is the only one I had a closer look at.
Fixes problems that prevent correct disk-based blending on OpenGL platforms.
Direct3D and OpenGL use different conventions for the range of the z-coordinate in normalised device coordinates as well as for the encoding of depth buffer values (see Platform-specific rendering differences).
In particular, the NDC z-range is [0, 1] in D3D and [-1, 1] in OpenGL. Also, raw depth buffer values in D3D are reversed, i.e. they range from 1 (near) to 0 (far), whereas OpenGL uses 0 (near) to 1 (far).
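As a hedged illustration of the depth-buffer side of this (not code from this PR), here is a minimal sketch using standard UnityCG helpers; `_CameraDepthTexture`, `SAMPLE_DEPTH_TEXTURE`, `Linear01Depth`, and `UNITY_REVERSED_Z` are Unity's own names, while the helper function `SceneDepth01` is a hypothetical name for this sketch:

```hlsl
#include "UnityCG.cginc"

// High-precision camera depth texture (declared by the shader itself).
sampler2D_float _CameraDepthTexture;

// Sketch only: read a raw depth value and convert it into a linear depth
// that behaves the same on Direct3D-like and OpenGL platforms.
float SceneDepth01(float2 uv)
{
    // Raw value: near = 1, far = 0 on reversed-Z (Direct3D-like) platforms,
    // near = 0, far = 1 on OpenGL.
    float raw = SAMPLE_DEPTH_TEXTURE(_CameraDepthTexture, uv);

    // Linear01Depth uses _ZBufferParams, which Unity sets per platform,
    // so the result is 0 at the near plane and 1 at the far plane everywhere.
    return Linear01Depth(raw);
}
```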
While many Unity macros and shader helper functions seem to abstract these differences away, they still cause problems in certain places. Notably, on OpenGL platforms `UnityObjectToClipPos` yields negative z-values close to the near clipping plane, and sampling high-precision depth textures produces raw values that increase with depth.

This commit adds platform checks so that NDC z-scaling and depth testing behave identically on OpenGL and Direct3D.
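For context, a minimal sketch of the kind of guard this implies; the `SHADER_API_GL*` macros and `UnityObjectToClipPos` are standard Unity shader names, but the helper `ObjectToNdcZ01` and the exact form of the checks are assumptions for illustration, not the code in the actual commit:

```hlsl
#include "UnityCG.cginc"

// Sketch only: object-space position -> NDC depth scaled to [0, 1] on both
// Direct3D and OpenGL, so later range checks and comparisons agree.
float ObjectToNdcZ01(float4 objectPos)
{
    float4 clipPos = UnityObjectToClipPos(objectPos);

    // Normalised device z after the perspective divide.
    float ndcZ = clipPos.z / clipPos.w;

#if defined(SHADER_API_GLCORE) || defined(SHADER_API_GLES) || defined(SHADER_API_GLES3)
    // OpenGL NDC z lies in [-1, 1] and is negative close to the near plane;
    // remap it to [0, 1] to match the Direct3D range.
    ndcZ = ndcZ * 0.5 + 0.5;
#endif
    // Note: on reversed-Z (Direct3D-like) platforms, 1 is still the near
    // plane; any direction-sensitive comparison must account for that too.
    return ndcZ;
}
```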