Closed: KammutierSpule closed this issue 4 years ago
Quite possibly, but that would be quite a pathological case (a lot of overlapping geometry). IMHO, failing hard (via the CHECK failing) is not unreasonable in that case: it's not worth increasing the size of BVH tree nodes to handle more than 64k primitives, since such a node would be quite inefficient anyway; any time a ray hits it, it's going to be doing a lot of intersection tests. Better, I think, to keep nodes small for regular usage.
No: remember that's looking at the bounds of the centroids of the geometry. That case could hit, for example, if one had multiple spheres at the origin, but with different radii. They'd all have a centroid bounds of (0,0,0) (and, there's no good way to partition them to make a good BVH), but we definitely don't want to make an empty leaf and ignore them.
Hi Matt, two questions:
Regarding this case: if (centroidBounds.pMax[dim] == centroidBounds.pMin[dim]) { (https://github.com/mmp/pbrt-v3/blob/master/src/accelerators/bvh.cpp#L266), could it lead to nPrimitives >= 65536 and an overflow at https://github.com/mmp/pbrt-v3/blob/master/src/accelerators/bvh.cpp#L646, because linearNode->nPrimitives is a uint16?
Since we are already checking for zero-size bounds at if (centroidBounds.pMax[dim] == centroidBounds.pMin[dim]) {, could we just ignore these objects and not create this leaf, or perhaps add an empty leaf instead?
An improvement could be to replace the == check with a small threshold, so that very small bounds are ignored as well.