facebookresearch / pytorch3d

PyTorch3D is FAIR's library of reusable components for deep learning with 3D data
https://pytorch3d.org/

Add `max` point reduction for chamfer distance #1838

Closed JulianKnodt closed 1 month ago

JulianKnodt commented 2 months ago

🚀 Feature

Add a `max` point reduction for chamfer distance. This should correspond to the Hausdorff distance between two meshes, which I think is not suitable for optimization since it will have sparse gradients, but it should be useful for evaluating the quality of an output.
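For concreteness, here is a minimal sketch of the quantity a `max` reduction would correspond to, written with plain `torch.cdist` rather than any PyTorch3D internals (the function name and the choice of unsquared distances are just for illustration):

import torch

def hausdorff_distance(x, y):
    # x: (P1, 3), y: (P2, 3) point sets.
    d = torch.cdist(x, y)               # (P1, P2) pairwise Euclidean distances
    x_to_y = d.min(dim=1).values.max()  # farthest x-point from its nearest neighbor in y
    y_to_x = d.min(dim=0).values.max()  # farthest y-point from its nearest neighbor in x
    # Only the single worst-case pair receives gradient, hence the sparse gradients mentioned above.
    return torch.maximum(x_to_y, y_to_x)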


Motivation

When evaluating the quality of a 3D mesh against another, it can be useful to consider both the chamfer distance and the Hausdorff distance. The latter is a feature in MeshLab and libigl, and a common metric in papers.

Pitch

This should allow for conveniently switching between the chamfer and Hausdorff distances. I've copied the code from the landing page and only changed the last line so that it computes the Hausdorff distance.

from pytorch3d.utils import ico_sphere
from pytorch3d.io import load_obj
from pytorch3d.structures import Meshes
from pytorch3d.ops import sample_points_from_meshes
from pytorch3d.loss import chamfer_distance

# Use an ico_sphere mesh and load a mesh from an .obj e.g. model.obj
sphere_mesh = ico_sphere(level=3)
verts, faces, _ = load_obj("model.obj")
test_mesh = Meshes(verts=[verts], faces=[faces.verts_idx])

# Differentiably sample 5k points from the surface of each mesh and then compute the loss.
sample_sphere = sample_points_from_meshes(sphere_mesh, 5000)
sample_test = sample_points_from_meshes(test_mesh, 5000)
# Proposed: point_reduction="max" yields a Hausdorff-style distance between the two samples
# instead of the mean chamfer distance.
loss_hausdorff, _ = chamfer_distance(sample_sphere, sample_test, point_reduction="max")
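
In the meantime, roughly the same quantity can be computed with the existing `pytorch3d.ops.knn_points`; this is a sketch under the assumption that the reduction is symmetrized with a max over both directions (note that `knn_points` returns squared distances, and whether the built-in reduction should return squared values or instead sum the two directions is part of the design question here):

import torch
from pytorch3d.ops import knn_points

# Squared distance from each sampled point to its nearest neighbor in the other sample.
sq_sphere_to_test = knn_points(sample_sphere, sample_test, K=1).dists[..., 0]  # (N, 5000)
sq_test_to_sphere = knn_points(sample_test, sample_sphere, K=1).dists[..., 0]  # (N, 5000)

# Symmetric (squared) Hausdorff distance per batch element.
sq_hausdorff = torch.maximum(
    sq_sphere_to_test.max(dim=1).values,
    sq_test_to_sphere.max(dim=1).values,
)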