TylerMcKenzie closed this issue 3 years ago
Looks like the HDR produced by cmft is at fault: https://forums.armory3d.org/t/equirectangular-world-texture/3801/5?u=blackgoku36 (left is the original, right is generated by cmft)
Hi, I'm currently revisiting this topic for the implementation of the Nishita sky model for the upcoming Blender 2.9 support. I believe this issue is not caused by cmft but by the geometry of the skydome itself, which seems to be a "sphere" with a rather flat lower half.
You can see that the same distortion happens on non-environment-map backgrounds too, the sun should be completely round in the following screenshot:
This is the world geometry displayed in RenderDoc:
@luboslenco is there a reason for this? Do you remember how the vertex coordinates in ConstData.hx were generated?
I still think there is an issue with cmft; mine shows issues happening horizontally. How does the cmft output appear in the build files? We could just use a cube or a simple sphere primitive that Blender makes, render the skydome last, and set its depth to 1 in the vertex shader (this is how most people do it). If I remember correctly, what Armory does is adjust the skydome size based on the camera's far plane.
It's possible that there is also a cmft issue (your screenshots indeed look like it), but the particular lemon-like distortion described in the first post is very likely not related to cmft, since you can see the same effect with different kinds of world shaders (the sun example is above, and here is another screenshot where you can see that the geometry is problematic). The normals are distorted by the non-spherical geometry, and I guess they are even distorted a tiny bit above the horizon due to interpolation between vertices. The texture coordinates for environment maps are actually calculated from the world normals (source).
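To illustrate why distorted normals bend the environment texture, here is a minimal Python sketch of the usual equirectangular lookup, direction → (u, v). The axis conventions are an assumption on my part; Armory's actual shader may order or flip the axes differently, but the point stands: any error in the normal translates directly into a shifted texture coordinate.

```python
import math

def equirect_uv(direction):
    """Map a world-space unit direction to equirectangular UV coordinates.

    Assumed convention: +Y is up, v = 0 at the zenith, v = 1 at the nadir.
    """
    x, y, z = direction
    # longitude -> u in [0, 1]
    u = 0.5 + math.atan2(x, -z) / (2.0 * math.pi)
    # latitude -> v in [0, 1] (clamp guards against rounding outside [-1, 1])
    v = 0.5 - math.asin(max(-1.0, min(1.0, y))) / math.pi
    return u, v
```

If the skydome's lower half is flattened, the interpolated "directions" near the horizon are no longer unit vectors pointing where the vertex position suggests, so the sampled texel is wrong and the image stretches.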
Cmft is only used for radiance and irradiance map generation as far as I know.
A simple sphere (icosphere maybe?) primitive would likely be the best solution. A cube is helpful for cubemaps and has less geometry (which is negligible, I guess), but with a unit sphere we have normals = position = direction, which is absolutely helpful for skies and other procedural shaders.
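The normals = position = direction identity can be checked with a tiny Python sketch (the parametrization is just a generic spherical one, not Armory's actual mesh generation): every vertex on a unit sphere centered at the origin is its own outward normal, so the sky shader gets the view direction for free.

```python
import math

def unit_sphere_vertex(theta, phi):
    """Spherical coordinates -> a position on the unit sphere."""
    x = math.sin(theta) * math.cos(phi)
    y = math.sin(theta) * math.sin(phi)
    z = math.cos(theta)
    return (x, y, z)

def normal_at(p):
    # The normal of a sphere centered at the origin is normalize(position);
    # for a *unit* sphere this is the position itself.
    length = math.sqrt(sum(c * c for c in p))
    return tuple(c / length for c in p)
```

With a flattened dome this breaks down: positions are no longer unit length, so normalize(position) and the interpolated vertex normals disagree, which is exactly the distortion seen in the screenshots.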
I'm not sure about the skydome size, but I guess it doesn't matter because we don't even write its depth to the depth buffer (source). But drawing it last with depth 1 might indeed be a good idea so that fewer fragments are drawn thanks to depth testing (@luboslenco what do you think about that?). There is also a problem with the current approach when it comes to SSR (screen space reflections): the world is not visible in the reflections.
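For reference, the "draw last with depth 1" trick usually looks something like the following GLSL sketch (identifiers are hypothetical, not Armory's actual shader; untested):

```glsl
// Hypothetical skydome vertex shader sketch.
// Writing clip.xyww forces NDC depth to 1.0 after the perspective divide,
// so the sky lands exactly on the far plane and the depth test rejects
// sky fragments behind already-drawn geometry.
uniform mat4 VP;       // view-projection, translation removed for the sky
in vec3 pos;           // unit-sphere vertex position
out vec3 viewDir;

void main() {
    viewDir = pos;     // unit sphere: position doubles as view direction
    vec4 clip = VP * vec4(pos, 1.0);
    gl_Position = clip.xyww;
}
```

Note this requires the depth comparison to be `LESS_EQUAL` (or similar), otherwise fragments exactly at depth 1.0 would be discarded against a cleared depth buffer.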
@MoritzBrueckner hm, I don't recall how the ConstData vertices were generated; the regular sphere you suggested sounds much better.
SHORT DESCRIPTION:
The HDR is not uniformly mapped to the scene.
EXPECTED RESULT:
The HDR is mapped uniformly.
ACTUAL RESULT:
My HDR is lemon-shaped.
HDR
Nodes