KammutierSpule opened this issue 7 years ago
I managed to make changes to the Ogre shader so that it closely matches the Unity results I was expecting. My setup is:

Unity uses its Smoothness component to compute Roughness. I found in their shaders that they do:
```glsl
perceptualRoughness = 1.0 - smoothness;
roughness = perceptualRoughness * perceptualRoughness;
```
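For reference, here is that conversion as a small self-contained GLSL sketch (the helper names are my own, not necessarily Unity's exact ones):

```glsl
// Unity-style smoothness -> roughness mapping (sketch; names assumed).
float smoothnessToPerceptualRoughness( float smoothness )
{
    return 1.0 - smoothness;
}

float perceptualRoughnessToRoughness( float perceptualRoughness )
{
    // Squaring yields the "alpha" roughness consumed by the GGX terms below.
    return perceptualRoughness * perceptualRoughness;
}
```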
Another important difference is the calculation of the "Roughness/Distribution/NDF term (GGX)". They use the squared roughness to compute a2 (so in the end it is perceptualRoughness^4), so we have:
```glsl
a2 = roughness * roughness; // roughness = perceptualRoughness * perceptualRoughness
f  = ( NdotH * a2 - NdotH ) * NdotH + 1.0;
R  = a2 / ( f * f + 1e-6f );
```
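Wrapped into a self-contained function, this is a sketch of that NDF (note that Unity also multiplies by 1/PI; whether that factor lives here or elsewhere depends on the engine's conventions):

```glsl
// GGX / Trowbridge-Reitz NDF, factored the same way as the snippet above.
// 'roughness' is already perceptualRoughness^2, so a2 is perceptualRoughness^4.
float distributionGGX( float NdotH, float roughness )
{
    float a2 = roughness * roughness;
    float f  = ( NdotH * a2 - NdotH ) * NdotH + 1.0; // = NdotH^2 * (a2 - 1) + 1
    return a2 / ( f * f + 1e-6f );                   // epsilon avoids division by zero
}
```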
Their diffuse term is also different, but in the tests I did the result looks closer to Unity if I keep Ogre's original diffuse code.
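If the difference is plain Lambert versus a Disney/Burley-style diffuse (an assumption on my part; Unity's Standard shader has a Disney-style diffuse in its BRDF1 path), the two terms would look roughly like this:

```glsl
// Lambert vs Disney/Burley-style diffuse factors (sketch; albedo and the
// usual 1/PI normalization are assumed to be applied by the caller).
float diffuseLambert()
{
    return 1.0;
}

float diffuseDisney( float NdotV, float NdotL, float LdotH, float perceptualRoughness )
{
    float fd90         = 0.5 + 2.0 * LdotH * LdotH * perceptualRoughness;
    float lightScatter = 1.0 + ( fd90 - 1.0 ) * pow( 1.0 - NdotL, 5.0 );
    float viewScatter  = 1.0 + ( fd90 - 1.0 ) * pow( 1.0 - NdotV, 5.0 );
    return lightScatter * viewScatter;
}
```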
Here are my results:
This may need more investigation into the differences.

How would you compare the different shading that Ogre and Unity are using?
I am trying to get similar render results in Ogre and in Unity, but I am getting different results; please have a look at this thread.
Also, I was comparing the Unity shader source code with the Ogre shader (BRDFs_piece_ps.glsl). From my understanding, Unity is using something similar to the Default BRDF (not Default|Uncorrelated), but with different, optimized math, even though the documentation suggests using DefaultUncorrelated to get Unity-like results. (Note: in that function in the Unity shader there is an #if 0 block, so they are not using the more correct math.)
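For comparison, this is how I understand the uncorrelated (separable) Smith GGX visibility term, which is what I assume the Default|Uncorrelated variant corresponds to (my reconstruction, not the exact Ogre code; the 1/(4*NdotL*NdotV) denominator is folded in):

```glsl
// Separable (uncorrelated) Smith GGX visibility term (sketch).
float smithGGXVisibilityUncorrelated( float NdotV, float NdotL, float roughness )
{
    float a2 = roughness * roughness;
    float gV = NdotV + sqrt( a2 + ( 1.0 - a2 ) * NdotV * NdotV );
    float gL = NdotL + sqrt( a2 + ( 1.0 - a2 ) * NdotL * NdotL );
    return 1.0 / ( gV * gL );
}
```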
I was trying to change the Ogre shader to the "Smith GGX Height-Correlated" term, but it did not give the results I expected.
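The height-correlated form I tried follows the usual formulation (a sketch after Heitz 2014 / the Frostbite course notes, not my exact code):

```glsl
// Height-correlated Smith GGX visibility term (sketch; 1/(4*NdotL*NdotV) folded in).
float smithGGXVisibilityHeightCorrelated( float NdotV, float NdotL, float roughness )
{
    float a2      = roughness * roughness;
    float lambdaV = NdotL * sqrt( NdotV * NdotV * ( 1.0 - a2 ) + a2 );
    float lambdaL = NdotV * sqrt( NdotL * NdotL * ( 1.0 - a2 ) + a2 );
    return 0.5 / ( lambdaV + lambdaL + 1e-6f ); // epsilon guards grazing angles
}
```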
Could you clarify these shader differences? Which one is more physically correct?