Closed vtushevskiy closed 2 years ago
On the CPU side, in C++:
1) precompute the tangent and bitangent from the triangle's edges and UV coordinates, essentially dP/dUV;
2) store them in a new vertex array.
In the GLSL shader:
3) interpolate the stored tangents/bitangents with barycentric coordinates for each sample.
Or maybe there is a better, more obvious way?
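Step 1 above could be sketched like this on the CPU side. This is a minimal sketch, assuming simple Vec2/Vec3 aggregates; the struct names, the `sub` helper, and the degenerate-UV guard are illustrative assumptions, not code from the repo:

```cpp
#include <cmath>

struct Vec2 { float x, y; };
struct Vec3 { float x, y, z; };

static Vec3 sub(Vec3 a, Vec3 b) { return { a.x - b.x, a.y - b.y, a.z - b.z }; }

// Solve [e1 e2] = [T B] * [duv1 duv2] for T and B, i.e. T = dP/dU, B = dP/dV,
// from the triangle's edge vectors and the corresponding UV deltas.
void TangentBitangent(Vec3 p0, Vec3 p1, Vec3 p2,
                      Vec2 uv0, Vec2 uv1, Vec2 uv2,
                      Vec3& T, Vec3& B)
{
    Vec3 e1 = sub(p1, p0), e2 = sub(p2, p0);
    float du1 = uv1.x - uv0.x, dv1 = uv1.y - uv0.y;
    float du2 = uv2.x - uv0.x, dv2 = uv2.y - uv0.y;
    float det = du1 * dv2 - du2 * dv1;
    // Guard against degenerate UVs (zero-area UV triangle).
    float r = (std::fabs(det) < 1e-8f) ? 0.0f : 1.0f / det;
    T = { r * (dv2 * e1.x - dv1 * e2.x),
          r * (dv2 * e1.y - dv1 * e2.y),
          r * (dv2 * e1.z - dv1 * e2.z) };
    B = { r * (du1 * e2.x - du2 * e1.x),
          r * (du1 * e2.y - du2 * e1.y),
          r * (du1 * e2.z - du2 * e1.z) };
}
```

For a unit triangle whose UVs match its XY positions, this returns T along +X and B along +Y, as expected.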
Hi,
The barycentric coordinates and the texture uv coords are calculated during triangle intersection here: https://github.com/knightcrawler25/GLSL-PathTracer/blob/5218db5d9794f10add9eb3774eae2e84c4896b43/src/shaders/common/closest_hit.glsl#L232-L248
A normal map is looked up based on the texture uv coords that were calculated earlier and an orthonormal basis is calculated to orient this normal based on the surface normal:
Even when not using a normal map, the tangent and bitangent from the Onb() function are used when sampling GGX and I haven't noticed issues so far: https://github.com/knightcrawler25/GLSL-PathTracer/blob/5218db5d9794f10add9eb3774eae2e84c4896b43/src/shaders/common/disney.glsl#L198-L199
Would you be able to provide an example for me to better understand the issue?
Update: I found an article about handedness, which says some models might contain reversed UVs that require T to be flipped... perhaps this is what you were referring to? http://www.opengl-tutorial.org/intermediate-tutorials/tutorial-13-normal-mapping/#handedness
My question is how you calculate the tangent and bitangent vectors in the Onb() function. I think it should be done from the UV derivatives: T = dP/dU; B = dP/dV; N = normal (dU, dV being the change in U and V respectively). For example, here https://learnopengl.com/Advanced-Lighting/Normal-Mapping and here https://ogldev.org/www/tutorial26/tutorial26.html there is some math to calculate the tangent and bitangent which you are probably skipping. Perhaps your way is also right; I just wanted to clarify it and double check.
you do:
vec3 UpVector = abs(N.z) < 0.99999 ? vec3(0, 0, 1) : vec3(1, 0, 0);
T = normalize(cross(UpVector, N));
B = cross(N, T);
But what if the model is rotating? What if the normal map has a different orientation? I think the result will be wrong.
UPDATE: http://www.thetenthplanet.de/archives/1180
UPDATE 2:
mat3 cotangent_frame( vec3 N, vec3 p, vec2 uv )
{
    // get edge vectors of the pixel triangle
    vec3 dp1 = dFdx( p );
    vec3 dp2 = dFdy( p );
    vec2 duv1 = dFdx( uv );
    vec2 duv2 = dFdy( uv );
    // solve the linear system
    vec3 dp2perp = cross( dp2, N );
    vec3 dp1perp = cross( N, dp1 );
    vec3 T = dp2perp * duv1.x + dp1perp * duv2.x;
    vec3 B = dp2perp * duv1.y + dp1perp * duv2.y;
    // construct a scale-invariant frame
    float invmax = inversesqrt( max( dot(T,T), dot(B,B) ) );
    return mat3( T * invmax, B * invmax, N );
}
UPDATE 3: dFdx/dFdy are screen-space derivatives, so they are not available for geometry that is not rasterized on screen; this will not work well for hidden geometry...
I read the articles you linked and I get what you're saying now. The way I'm calculating the tangent and bitangent isn't right for normal mapping, but it works fine for surface shading (where the orientation of the tangent and bitangent around the normal isn't important). Here's how it looks when a plane with a normal map is rotated (the light is stationary). There are also artifacts where the perturbed normal goes below the surface.
PBRT seems to calculate the tangent (if it wasn't already supplied by the model) from the partial derivatives and calculates the bitangent from the cross product of the tangent and normal: https://www.pbr-book.org/3ed-2018/Shapes/Triangle_Meshes. I'll try going the same route and see if I run into any other issues. Thanks for pointing this out!
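That route (a raw tangent from the partial derivatives, then the bitangent from a cross product) can be sketched like this in C++. The Vec3 struct and helper functions are assumptions for illustration, not the repo's or PBRT's actual types:

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

static Vec3 cross(Vec3 a, Vec3 b)
{
    return { a.y * b.z - a.z * b.y,
             a.z * b.x - a.x * b.z,
             a.x * b.y - a.y * b.x };
}
static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static Vec3 normalize(Vec3 v)
{
    float l = std::sqrt(dot(v, v));
    return { v.x / l, v.y / l, v.z / l };
}

// Given the shading normal N and a raw tangent rawT (e.g. dP/dU from the UV
// derivatives), Gram-Schmidt rawT against N and take the bitangent as a cross
// product, so the frame stays orthonormal even if rawT was not exactly
// perpendicular to N.
void BuildFrame(Vec3 N, Vec3 rawT, Vec3& T, Vec3& B)
{
    float d = dot(rawT, N);
    T = normalize({ rawT.x - d * N.x, rawT.y - d * N.y, rawT.z - d * N.z });
    B = cross(N, T);
}
```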
Very good! The next step is to figure out how to process your triangles. In the matrix equation, as I understand it, you use the 3 vertices of the triangle, p0, p1, p2, and their UVs, uv0, uv1, uv2. After finding tangents and bitangents at these points, you will probably have to interpolate them with barycentric coordinates. Since all your work happens in the fragment shader, I think you have two options: 1) calculate the tangents and bitangents in the main program (the C++ part, since there is no traditional vertex shader) and store them in an additional buffer; 2) do triple the work and calculate them in each call of ClosestHit() for all 3 vertices of the current triangle, then interpolate.
Instead of that, we can do it with barycentric coordinates: take P0, UV0 as the current point and compute P1 from UV0 + dUV via the barycentric-coordinate conversion, instead of using the triangle's vertices.
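The barycentric interpolation step mentioned above might look like this. This is a sketch in C++; the Vec3 struct and the weight convention w0 = 1 - u - v are assumptions for illustration:

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

// Interpolate the three per-vertex tangents with the hit's barycentric
// coordinates (weights 1 - u - v, u, v) and renormalize the result.
Vec3 InterpolateTangent(Vec3 t0, Vec3 t1, Vec3 t2, float u, float v)
{
    float w0 = 1.0f - u - v;
    Vec3 t = { w0 * t0.x + u * t1.x + v * t2.x,
               w0 * t0.y + u * t1.y + v * t2.y,
               w0 * t0.z + u * t1.z + v * t2.z };
    float l = std::sqrt(t.x * t.x + t.y * t.y + t.z * t.z);
    return { t.x / l, t.y / l, t.z / l };
}
```

At a vertex (e.g. u = v = 0) this reduces to the normalized per-vertex tangent, as it should.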
Can "ONB revisited" solve this problem? Please show the same animation with this code:
// Building an Orthonormal Basis, Revisited
// by Tom Duff, James Burgess, Per Christensen, Christophe Hery, Andrew Kensler, Max Liani, Ryusuke Villemin
// https://graphics.pixar.com/library/OrthonormalB/
//-----------------------------------------------------------------------
void Onb(in vec3 N, inout vec3 T, inout vec3 B)
//-----------------------------------------------------------------------
{
    float sgn = N.z >= 0.0f ? 1.0f : -1.0f;
    float aa = -1.0f / (sgn + N.z);
    float bb = N.x * N.y * aa;
    T = vec3(1.0f + sgn * N.x * N.x * aa, sgn * bb, -sgn * N.x);
    B = vec3(bb, sgn + N.y * N.y * aa, -N.y);
}
@tigrazone: Looks like the Pixar paper only deals with precision issues. However, I was able to fix the issues by using the method from https://learnopengl.com/Advanced-Lighting/Normal-Mapping
@vtushevskiy Turns out there was another problem with the tangent and bitangent not being rotated by the transformation matrix (similar to the normal) which is why the shadows also rotated with the map. https://github.com/knightcrawler25/GLSL-PathTracer/blob/5218db5d9794f10add9eb3774eae2e84c4896b43/src/shaders/common/closest_hit.glsl#L245
It is now fixed along with the issue with dark patches at grazing angles.
Here is a before and after:
I'll clean up the code and update the repo.
@knightcrawler25 I would also recommend testing the normal map on something like a rotating sphere with a couple of lights to the side.
"Turns out there was another problem with the tangent and bitangent not being rotated by the transformation matrix (similar to the normal) which is why the shadows also rotated with the map. "
Yeah, it is necessary to get all vectors into world space.
"Looks like the Pixar paper only deals with precision issues. However, I was able to fix the issues by using the method from https://learnopengl.com/Advanced-Lighting/Normal-Mapping"
I see they use the glm library to access the geometry and calculate the tangent/bitangent. If you have millions of polygons it may take a while to do that sequentially; it would have to be multithreaded.
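One way to multithread such a precompute pass, sketched with the standard library. The chunking scheme and the ParallelForTriangles name are assumptions for illustration, not code from the repo:

```cpp
#include <algorithm>
#include <cstddef>
#include <functional>
#include <thread>
#include <vector>

// Split the triangle range across hardware threads; computeOne stands in for
// the real per-triangle tangent/bitangent math. Each index is processed by
// exactly one thread, so no synchronization is needed beyond the final join.
void ParallelForTriangles(std::size_t count,
                          const std::function<void(std::size_t)>& computeOne)
{
    unsigned n = std::max(1u, std::thread::hardware_concurrency());
    std::size_t chunk = (count + n - 1) / n;
    std::vector<std::thread> pool;
    for (unsigned t = 0; t < n; ++t) {
        std::size_t begin = t * chunk;
        std::size_t end = std::min(count, begin + chunk);
        if (begin >= end) break;
        pool.emplace_back([=] {
            for (std::size_t i = begin; i < end; ++i) computeOne(i);
        });
    }
    for (auto& th : pool) th.join();
}
```

The per-thread ranges are disjoint, so writing per-triangle results into a preallocated array from the callback is safe without locks.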
For now, I'm calculating the tangents on the fly whenever a triangle is intersected as the shader has access to the vertices and uvs.
"@knightcrawler25 I would also recommend testing the normal map on something like a rotating sphere with a couple of lights to the side."
Seems to be working fine:
Two lights and a HDR:
Same scene in RenderMan: (Some subtle differences if you flip between the two renders)
It looks like it is working!
In the GetMaterials() function you have:
state.normal = normalize(state.tangent * texNormal.x + state.bitangent * texNormal.y + state.ffnormal * texNormal.z);
state.ffnormal = normalize(state.normal);
Do you think 'ffnormal' should be equal to 'normal'?
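For reference, the perturbation in that quoted GLSL line can be sketched in plain C++ like this. The Vec3 struct is an assumption, the TBN frame is assumed orthonormal, and texNormal is assumed already remapped from [0, 1] to [-1, 1]:

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

// Rotate the tangent-space normal-map sample n into world space with the
// TBN frame (world = T * n.x + B * n.y + N * n.z), then renormalize.
Vec3 PerturbNormal(Vec3 T, Vec3 B, Vec3 N, Vec3 n)
{
    Vec3 r = { T.x * n.x + B.x * n.y + N.x * n.z,
               T.y * n.x + B.y * n.y + N.y * n.z,
               T.z * n.x + B.z * n.y + N.z * n.z };
    float l = std::sqrt(r.x * r.x + r.y * r.y + r.z * r.z);
    return { r.x / l, r.y / l, r.z / l };
}
```

A flat normal-map sample (0, 0, 1) returns the surface normal unchanged, which is a quick sanity check for the frame.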
I pushed a change to the repo. Now, it should work properly when looking at backfaces as well.
Some quick tests to confirm:
The way you are calculating the tangent and bitangent in the Onb() function looks very simple. But is it right?
I think it will not produce correct results; it may flip the coordinates of the normal map. Shouldn't we precompute the tangent and bitangent from the triangle's edges and UV coordinates, as something like dP/dUV?