facebookresearch / pytorch3d

PyTorch3D is FAIR's library of reusable components for deep learning with 3D data
https://pytorch3d.org/

Incorporate Normal Maps into the renderer #642


vikramjit-sidhu commented 3 years ago

🚀 Feature

Incorporate normal mapping into the rendering pipeline.
A normal map is an image that is used to perturb the mesh normals before shading.

Motivation

Normal mapping (or bump mapping) is a useful technique for enhancing the appearance and details of a low-polygon mesh.

Furthermore, normal maps appear in many recent computer graphics research works.
They are created with generative models to add detail to a mesh (Link), and they are used in inverse rendering techniques that estimate normal maps from images (Link).

Pitch

Incorporate normal maps into the rendering pipeline.
Since a normal map is essentially an image, it can be incorporated into the current Textures object. An example is as follows:

# cv2.imread returns BGR uint8; convert to RGB float in [0, 1]
normal_map_img = torch.from_numpy(cv2.cvtColor(cv2.imread("normal_map.png"), cv2.COLOR_BGR2RGB)).float() / 255.0
texture = Textures(verts_uvs=verts_uvs, faces_uvs=faces_uvs, normal_map=normal_map_img)
mesh.textures = texture

The actual implementation can be done in the Textures class.
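
To sketch what the shader side of this proposal might look like: once the normal map has been sampled per pixel, the texel is decoded from [0, 1] to [-1, 1] and rotated from tangent space into world space. This is not existing PyTorch3D API; `perturb_normals` is a hypothetical helper operating on already-interpolated normals, tangents, and sampled texels.

```python
import torch
import torch.nn.functional as F

def perturb_normals(pixel_normals, tangents, sampled_map):
    """Decode a tangent-space normal-map sample and rotate it into world space.

    pixel_normals: (..., 3) interpolated surface normals
    tangents:      (..., 3) surface tangents (e.g. from UV derivatives)
    sampled_map:   (..., 3) normal-map texels in [0, 1]
    """
    n = F.normalize(pixel_normals, dim=-1)
    # Gram-Schmidt: make the tangent orthogonal to the normal
    t = F.normalize(tangents - (tangents * n).sum(-1, keepdim=True) * n, dim=-1)
    b = torch.cross(n, t, dim=-1)
    # Decode texel from [0, 1] to [-1, 1]
    m = sampled_map * 2.0 - 1.0
    # Rotate the tangent-space normal into world space: n' = m_x*T + m_y*B + m_z*N
    return F.normalize(m[..., :1] * t + m[..., 1:2] * b + m[..., 2:] * n, dim=-1)
```

A flat texel (0.5, 0.5, 1.0) decodes to (0, 0, 1) and leaves the surface normal unchanged, which is a useful sanity check.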

I can also implement this feature if there are no plans to incorporate it in the near future.

nikhilaravi commented 3 years ago

@vikramjit-sidhu thanks for the suggestion. We currently don't have a plan to do this. Do you have a proposal for the implementation other than the API for adding it to the textures class? i.e. how do you want to integrate the normal map in the current shading pipeline?

vikramjit-sidhu commented 3 years ago

Hi @nikhilaravi, thanks for the reply.
The implementation could be done within the shader (renderer/mesh/shader.py) or, alternatively, during shading (renderer/mesh/shading.py).
In both places we have access to the mesh, its normals, and the textures (which will contain the normal map).

I will have to revise the details of normal mapping; as I recall, it requires the tangents and bitangents, all of which can be computed from the information available.
It could be implemented as either a method or a class; initially, a method probably makes more sense.
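
For reference, per-face tangents and bitangents follow from the triangle positions and UVs via the standard derivation. A minimal sketch (the function name `face_tangents` is mine, and it assumes non-degenerate UV triangles):

```python
import torch

def face_tangents(tri_verts, tri_uvs):
    """Per-face tangent/bitangent from triangle positions and UVs.

    tri_verts: (F, 3, 3) xyz positions of each triangle's corners
    tri_uvs:   (F, 3, 2) uv coordinates of the same corners
    """
    e1 = tri_verts[:, 1] - tri_verts[:, 0]    # position edge 1
    e2 = tri_verts[:, 2] - tri_verts[:, 0]    # position edge 2
    duv1 = tri_uvs[:, 1] - tri_uvs[:, 0]      # uv edge 1
    duv2 = tri_uvs[:, 2] - tri_uvs[:, 0]      # uv edge 2
    # Determinant of the 2x2 UV edge matrix (assumed nonzero)
    det = duv1[:, 0] * duv2[:, 1] - duv2[:, 0] * duv1[:, 1]
    r = (1.0 / det)[:, None]
    tangent = (e1 * duv2[:, 1:2] - e2 * duv1[:, 1:2]) * r
    bitangent = (e2 * duv1[:, 0:1] - e1 * duv2[:, 0:1]) * r
    return tangent, bitangent
```

For a triangle whose UVs are axis-aligned with its edges, the tangent and bitangent come out along those edges, which makes the formula easy to check by hand.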

timlod commented 2 years ago

I implemented normal mapping for my application. I'll give some code examples here to help others who want to do the same. I won't make a PR at this point because I did not create a nice, standard interface - I only needed this to work for one specific use case.

In any case, I added normal maps (along with specular and roughness maps for PBR) to a Material. The maps are stored in tangent space as TexturesUV and sampled during shading. Hence, you will need to write a custom shader that updates pixel_normals using the normal map.

For the actual computations, I largely followed OpenGL tutorials such as http://www.opengl-tutorial.org/intermediate-tutorials/tutorial-13-normal-mapping/ or https://learnopengl.com/Advanced-Lighting/Normal-Mapping.

In the shader, I do something like this:

    pixel_normals = interpolate_face_attributes(
        fragments.pix_to_face, fragments.bary_coords, faces_normals
    )
    if materials.use_normal_map:
        pixel_normals = materials.apply_normal_map(
            pixel_normals,
            fragments,
            faces[: meshes.num_faces_per_mesh()[0]],
            verts[: meshes.num_verts_per_mesh()[0]],
        )

Here you can see that my implementation assumes that batches are homogeneous (same faces/verts throughout). apply_normal_map looks like this:

    def apply_normal_map(self, pixel_normals, fragments, faces, verts):
        pix_to_face = fragments.pix_to_face
        batch_size = pix_to_face.shape[0]

        tangent = self.compute_tangent(verts[faces])
        # Smoothe the tangent map by interpolating per vertex tangent
        tangent_map = self.interpolate_face_average_attributes(
            tangent, fragments, verts, faces, batch_size
        )

        pixel_normals = F.normalize(pixel_normals, dim=-1)
        bitangent_map = torch.cross(pixel_normals, tangent_map, dim=-1)
        bitangent_map = F.normalize(bitangent_map, dim=-1)
        tangent_map = torch.cross(bitangent_map, pixel_normals, dim=-1)
        tangent_map = F.normalize(tangent_map, dim=-1)

        # pixel-wise TBN matrix - flip to get correct direction
        TBN = torch.stack(
            (-tangent_map, -bitangent_map, pixel_normals), dim=4
        )
        nm = self.normal_map.sample_textures(fragments)

        return F.normalize(
            torch.matmul(
                TBN.transpose(-1, -2).reshape(-1, 3, 3), nm.reshape(-1, 3, 1)
            ).reshape(pixel_normals.shape),
            dim=-1,
        )
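
Note that the snippet above references two helpers that are not shown, `compute_tangent` and `interpolate_face_average_attributes`. A pure-PyTorch sketch of the vertex-averaging step of the latter, based on my reading of its intent (scatter per-face tangents to vertices and average over incident faces, before barycentric interpolation):

```python
import torch
import torch.nn.functional as F

def average_face_attribute_per_vertex(face_attr, faces, num_verts):
    """Average a per-face vector attribute over each vertex's incident faces.

    face_attr: (F, 3) one vector per face (e.g. a face tangent)
    faces:     (F, 3) vertex indices of each face
    num_verts: total number of vertices
    Returns (V, 3) per-vertex averaged, normalized attribute.
    """
    vert_attr = torch.zeros(num_verts, 3)
    counts = torch.zeros(num_verts, 1)
    for corner in range(3):
        idx = faces[:, corner]
        # Accumulate each face's attribute onto its corner vertices
        vert_attr.index_add_(0, idx, face_attr)
        counts.index_add_(0, idx, torch.ones(len(idx), 1))
    return F.normalize(vert_attr / counts.clamp(min=1), dim=-1)
```

The per-vertex result can then be gathered per face and interpolated with `interpolate_face_attributes`, matching the "smoothe the tangent map" comment above.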

lith0613 commented 2 years ago

@timlod Hi, can you share how to optimize normal maps (along with specular and roughness maps for PBR) in PyTorch3D? I want to optimize these material maps via inverse rendering so that I can use the resulting PBR maps in other software such as Unreal Engine. If possible, would you share a demo of how you incorporated normal maps into the renderer? Thanks so much!