armory3d / armory

3D Engine with Blender Integration
https://armory3d.org
zlib License

HDR is not uniform #1277

Closed TylerMcKenzie closed 3 years ago

TylerMcKenzie commented 5 years ago

SHORT DESCRIPTION:

The HDR is not uniformly mapped to the scene.

EXPECTED RESULT:

The HDR is mapped uniformly.

ACTUAL RESULT:

My HDR is distorted into a lemon-like shape.

HDR

[Screenshot: hdr_weirdness]

Nodes

[Screenshot: nodes]

BlackGoku36 commented 4 years ago

Looks like the HDR produced by cmft is at fault: https://forums.armory3d.org/t/equirectangular-world-texture/3801/5?u=blackgoku36 (left is the original, right is generated by cmft)

[Screenshots: original (left) and cmft output (right)]

MoritzBrueckner commented 3 years ago

Hi, I'm currently revisiting this topic for the implementation of the Nishita sky model for the upcoming Blender 2.9 support. I believe that this issue is not caused by cmft, but instead by the geometry of the skydome itself, which seems to be a "sphere" with a rather flat lower half.

You can see that the same distortion happens on non-environment-map backgrounds too; the sun should be completely round in the following screenshot:

[Screenshot: nishita_sun_distorted]

This is the world geometry displayed in RenderDoc:

[Screenshot: skydome_geometry]

@luboslenco is there a reason for this? Do you remember how the vertex coordinates in ConstData.hx were generated?

BlackGoku36 commented 3 years ago

I still think there is an issue with cmft; in my case the problems show up horizontally. How does the cmft output look in the build files? We could just use a cube or a simple sphere primitive generated by Blender and draw the skydome last, setting its depth to 1 in the vertex shader (this is how most engines do it). If I remember correctly, what Armory currently does is scale the skydome based on the camera's far plane.
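
For reference, a minimal GLSL sketch of the "draw last, depth = 1" idea (illustrative names like `WVP`; this is not Armory's actual shader code):

```glsl
#version 450

in vec3 pos;
out vec3 viewDir;

uniform mat4 WVP; // hypothetical combined world-view-projection matrix

void main() {
    // For a unit primitive centered on the camera, the local position
    // doubles as the view direction for the fragment shader.
    viewDir = pos;

    vec4 clipPos = WVP * vec4(pos, 1.0);
    // Use w as z so that z/w == 1.0 after the perspective divide: the sky
    // lands exactly on the far plane. Drawn last with a LESS_OR_EQUAL depth
    // test, only fragments not covered by scene geometry get shaded.
    gl_Position = clipPos.xyww;
}
```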

MoritzBrueckner commented 3 years ago

It's possible that there is also a cmft issue (your screenshots do look like it), but this particular lemon-like distortion described in the first post is very likely not related to cmft, as you can see the same effect in different kinds of world shaders (the sun example is above; here is another screenshot where you can see that the geometry is problematic). Normals are distorted by the non-sphere geometry, and I guess they are even distorted a tiny bit above the horizon due to interpolation between vertices. The texture coordinates for environment maps are actually calculated based on the world normals (source).
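
To illustrate why that matters, here is a rough sketch of such an equirectangular lookup (similar in spirit to the linked shader, but with illustrative names and conventions, not the exact Armory code). Since the UV comes entirely from the normal, any error in the dome normals feeds straight into the sampled texture:

```glsl
const float PI = 3.14159265358979;

// Map a normalized direction/normal to equirectangular UV coordinates.
vec2 envMapEquirect(vec3 n) {
    // Longitude -> u, latitude -> v, so a squashed normal produces a
    // squashed (lemon-like) background.
    float u = atan(n.x, -n.z) / (2.0 * PI) + 0.5;
    float v = acos(clamp(n.y, -1.0, 1.0)) / PI;
    return vec2(u, v);
}

// Usage in a fragment shader:
//   vec3 n = normalize(worldNormal);
//   vec3 sky = texture(envmap, envMapEquirect(n)).rgb;
```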

Cmft is only used for radiance and irradiance map generation as far as I know.

A simple sphere (an icosphere maybe?) primitive would likely be the best solution. A cube is helpful for cubemaps and has less geometry (which is negligible, I guess), but with a unit sphere we have normals = position = direction, which is absolutely helpful for skies and other procedural shaders; see the sketch below.
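
A small sketch of what that buys us (illustrative names; it pairs with the vertex-shader sketch above): on a unit sphere the interpolated local position only needs to be renormalized to get the per-fragment sampling direction, no separate normal attribute required:

```glsl
#version 450

in vec3 viewDir;          // interpolated local position from the vertex shader
out vec4 fragColor;

uniform sampler2D envmap; // equirectangular environment texture (illustrative)

const float PI = 3.14159265358979;

void main() {
    // On a unit sphere, position == normal == direction, so renormalizing
    // the interpolated position gives the exact sampling direction.
    vec3 dir = normalize(viewDir);
    vec2 uv = vec2(atan(dir.x, -dir.z) / (2.0 * PI) + 0.5, acos(dir.y) / PI);
    fragColor = vec4(texture(envmap, uv).rgb, 1.0);
}
```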

I'm not sure about the skydome size, but I guess it doesn't matter because we don't even write its depth to the depth buffer (source). But drawing it last with depth 1 might indeed be a good idea so that fewer fragments are shaded thanks to depth testing (@luboslenco, what do you think about that?). There is even a problem with the current approach when it comes to SSR (screen space reflections): the world is not visible in the reflection.

luboslenco commented 3 years ago

@MoritzBrueckner Hm, I don't recall how the ConstData vertices were generated; the regular sphere you suggested sounds much better.