gkjohnson / three-gpu-pathtracer

Path tracing renderer and utilities for three.js built on top of three-mesh-bvh.
https://gkjohnson.github.io/three-gpu-pathtracer/example/bundle/index.html

Add support for black and white Bump Maps to compute normals PR #559

Closed - dongho-shin closed this PR 6 months ago

dongho-shin commented 6 months ago

Related: https://github.com/gkjohnson/three-gpu-pathtracer/issues/558

Also, this PR has a problem (I wrote the details in the related issue #558):

the bump map is not applied fully successfully (especially on a plane).

(screenshot attached)
gkjohnson commented 6 months ago

Thanks - this is easier to understand. I think it's best to look at how normal mapping is handled in the path tracer and use similar code. A bump map is basically a normal map - you just have to compute the normals from a notional "height" computed from the black and white values.
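For reference, a minimal sketch of that idea in GLSL could look like the following (the function and the bumpMap / uv / bumpScale names are hypothetical, not the project's actual API): the bump texel is read as a height, finite differences give its slope, and the slope is turned into a tangent-space normal that can be transformed by the TBN matrix just like a normal map sample.

// Hypothetical sketch, not the project's actual code: derive a tangent-space
// normal from a black and white bump map by treating the texel value as a height.
vec3 bumpToTangentNormal( sampler2D bumpMap, vec2 uv, float bumpScale ) {

  // size of one texel in UV space
  vec2 texelSize = 1.0 / vec2( textureSize( bumpMap, 0 ) );

  // height at the hit point and one texel along u and v
  float h  = texture( bumpMap, uv ).r;
  float hU = texture( bumpMap, uv + vec2( texelSize.x, 0.0 ) ).r;
  float hV = texture( bumpMap, uv + vec2( 0.0, texelSize.y ) ).r;

  // slope of the height field, scaled by the bump strength
  vec2 dHduv = bumpScale * vec2( hU - h, hV - h );

  // normal of the height field surface ( u, v, h( u, v ) ) in tangent space,
  // analogous to what a normal map texel would store
  return normalize( vec3( - dHduv.x, - dHduv.y, 1.0 ) );

}

A WebGL2 / GLSL ES 3.00 context is assumed here so that textureSize and texture are available.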

gkjohnson commented 6 months ago

You'll also want to make sure that tangents are generated when bump maps are present instead of just when normal maps are:

https://github.com/gkjohnson/three-gpu-pathtracer/blob/main/src/core/DynamicPathTracingSceneGenerator.js#L83-L84

dongho-shin commented 6 months ago

@gkjohnson I missed that... I'll check the tangents. I'm only using a bumpMap on the mesh, and I can't get the tangent attribute in the shader as expected.

dongho-shin commented 6 months ago

I'm pretty new to GLSL and graphics, so I don't have an idea of how to compute dFdx and dFdy manually. Can I get a hint?

This is as far as I understand it at this point: surf_pos (-vViewPosition), the input to perturbNormalArb (in my PR and in the three.js bump map function), comes from the vertex shader, so its value is not affected by the path tracing sampling, but dFdx(surf_pos) and dFdy(surf_pos) are affected, right? Because dFdx and dFdy look at neighboring fragments around gl_FragCoord and compute the change along x or y automatically.

Should I then get the nearby vertex positions in screen space, pass them in as a uniform, and use them in the GLSL?

I also checked the Blender Cycles OSL code (node_bump.osl):

/* "Bump Mapping Unparameterized Surfaces on the GPU"
 * Morten S. Mikkelsen, 2010 */

surface node_bump(int invert = 0,
                  int use_object_space = 0,
                  normal NormalIn = N,
                  float Strength = 0.1,
                  float Distance = 1.0,
                  float SampleCenter = 0.0,
                  float SampleX = 0.0,
                  float SampleY = 0.0,
                  output normal NormalOut = N)
{
  point Ptmp = P;
  normal Normal = NormalIn;

  if (use_object_space) {
    Ptmp = transform("object", Ptmp);
    Normal = normalize(transform("object", Normal));
  }

  /* get surface tangents from normal */
  vector dPdx = Dx(Ptmp);
  vector dPdy = Dy(Ptmp);

  vector Rx = cross(dPdy, Normal);
  vector Ry = cross(Normal, dPdx);

  /* compute surface gradient and determinant */
  float det = dot(dPdx, Rx);
  vector surfgrad = (SampleX - SampleCenter) * Rx + (SampleY - SampleCenter) * Ry;

  float absdet = fabs(det);

  float strength = max(Strength, 0.0);
  float dist = Distance;

  if (invert)
    dist *= -1.0;

  /* compute and output perturbed normal */
  NormalOut = normalize(absdet * Normal - dist * sign(det) * surfgrad);
  NormalOut = normalize(strength * NormalOut + (1.0 - strength) * Normal);

  if (use_object_space) {
    NormalOut = normalize(transform("object", "world", NormalOut));
  }
}

I think it's the same as the three.js code. Still a work in progress...
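For comparison, a rough GLSL transcription of the surface-gradient math above might look like this (all names are hypothetical and the OSL node's strength blending is omitted); the position derivatives and the two height differences are passed in explicitly, since the path tracer has no screen-space Dx / Dy to compute them from:

// Hypothetical sketch of the node_bump.osl surface-gradient math in GLSL.
// dPdx / dPdy are derivatives of the hit position along two directions,
// dBx / dBy are the corresponding bump height differences.
vec3 perturbNormalBump( vec3 normal, vec3 dPdx, vec3 dPdy, float dBx, float dBy, float dist ) {

  // surface tangents from the normal and the position derivatives
  vec3 Rx = cross( dPdy, normal );
  vec3 Ry = cross( normal, dPdx );

  // surface gradient and determinant, as in node_bump.osl
  float det = dot( dPdx, Rx );
  vec3 surfGrad = dBx * Rx + dBy * Ry;

  // perturbed normal
  return normalize( abs( det ) * normal - dist * sign( det ) * surfGrad );

}

Here dPdx / dPdy would have to come from the hit triangle's geometry or from offset samples, and dBx / dBy from offset bump map samples, rather than from screen-space derivatives.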

dongho-shin commented 6 months ago
(screenshot attached)

I fixed the shader and it looks fine now. As you said, I calculate the height the way the normal map path does and add it to the normal:

normal += vTBN * perturbNormalArb( -vViewPosition, ( normal * 2.0 - 1.0 ), dHdxy, surfaceHit.side );
gkjohnson commented 6 months ago

I'm pretty new to GLSL and graphics, so I don't have an idea of how to compute dFdx and dFdy manually. Can I get a hint?

Of course! Feel free to ask questions. I'm happy to have some help adding additional features to the project.

Regarding the dFdx and dFdy functions - they compute the apparent change in a variable from pixel to pixel in screen space. They do this by checking sibling pixels and computing the difference between the variable values. This is great for traditional rasterization, but because all of these path tracing operations occur as a light ray bounces around a scene, it's not appropriate to compute the change in texture value in screen space.

Likewise the idea of using a "view position" or "direction" (which is based on the view ray from the original camera position) doesn't make sense, since the rays can hit a surface from any direction after bouncing around. This is why perturbNormalArb cannot be used.

So on to how to compute the derivative (dFdx, dFdy) of the texture without using the built-in functions - the derivative is the change in a variable along some dimension. To manually compute the change along the X axis, for example, you can sample the value at the hit point and at one pixel to the left and right of it, then compute the derivative from those values.

Since a bump map is basically a height map, the derivative at a pixel represents the slope of the surface described by the bump map, which you can use to compute the normal.
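As a rough sketch of that idea (with hypothetical names, not the project's actual functions), the screen-space dFdx / dFdy of the height can be replaced by central differences in texture space, sampling the bump map one texel to either side of the hit UV:

// Hypothetical sketch: manual finite differences of the bump height in
// texture space, taking the place of dFdx / dFdy in the rasterized version.
vec2 bumpDerivative( sampler2D bumpMap, vec2 uv, float bumpScale ) {

  // size of one texel in UV space
  vec2 texelSize = 1.0 / vec2( textureSize( bumpMap, 0 ) );

  // heights one texel to the left / right and below / above the hit point
  float hLeft  = texture( bumpMap, uv - vec2( texelSize.x, 0.0 ) ).r;
  float hRight = texture( bumpMap, uv + vec2( texelSize.x, 0.0 ) ).r;
  float hDown  = texture( bumpMap, uv - vec2( 0.0, texelSize.y ) ).r;
  float hUp    = texture( bumpMap, uv + vec2( 0.0, texelSize.y ) ).r;

  // central differences of the height along u and v
  return bumpScale * 0.5 * vec2( hRight - hLeft, hUp - hDown );

}

The resulting dHdxy-style value could then be plugged into the same normal perturbation math the bump map path already uses.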

Hopefully this was helpful - let me know if you need me to elaborate on anything.

dongho-shin commented 6 months ago

Then the approach would be simple with a conversion of bump to normal... I'm going to look for a solution for it.