yanx27 / Pointnet_Pointnet2_pytorch

PointNet and PointNet++ implemented in PyTorch (pure Python), with experiments on ModelNet, ShapeNet and S3DIS.
MIT License

Inconsistent test loss results with PointNet++ during evaluation #172

Open YasmeenAlsaedy opened 2 years ago

YasmeenAlsaedy commented 2 years ago

Hi all, we ran into an issue with this implementation during the evaluation stage: we got inconsistent test loss results each time we evaluated the same data.

We traced the code and found the cause of this problem at line 75 in pointnet2_utils.py. For more details, please check: https://discuss.pytorch.org/t/cannot-get-consistent-results-with-pointnet/142072


import torch


def farthest_point_sample(xyz, npoint):
    """
    Input:
        xyz: pointcloud data, [B, N, 3]
        npoint: number of samples
    Return:
        centroids: sampled pointcloud index, [B, npoint]
    """
    device = xyz.device
    B, N, C = xyz.shape
    centroids = torch.zeros(B, npoint, dtype=torch.long).to(device)
    distance = torch.ones(B, N).to(device) * 1e10
    # The first centroid is chosen at random, so every call can start from a
    # different point and therefore return different samples.
    farthest = torch.randint(0, N, (B,), dtype=torch.long).to(device)  ############ The cause
    batch_indices = torch.arange(B, dtype=torch.long).to(device)
    for i in range(npoint):
        centroids[:, i] = farthest
        centroid = xyz[batch_indices, farthest, :].view(B, 1, 3)
        # Squared distance from every point to the newest centroid.
        dist = torch.sum((xyz - centroid) ** 2, -1)
        # Keep, per point, the distance to its nearest chosen centroid so far.
        mask = dist < distance
        distance[mask] = dist[mask]
        # The next centroid is the point farthest from all chosen centroids.
        farthest = torch.max(distance, -1)[1]
    return centroids
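
As a minimal sketch of the symptom (using a random dummy point cloud rather than our actual data): calling the function twice on the same input generally returns different index sets, because torch.randint draws a different starting point on each call.

import torch

# Dummy batch of point clouds, shape [B, N, 3]; any fixed tensor shows the same effect.
xyz = torch.rand(2, 1024, 3)
idx1 = farthest_point_sample(xyz, 128)
idx2 = farthest_point_sample(xyz, 128)
print(torch.equal(idx1, idx2))  # usually False: different seed points lead to different samples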

Best, Yasmeen

GoostValley commented 1 year ago

Do you have any solution? The evaluation results vary every time.

YasmeenAlsaedy commented 1 year ago

Hi, yes, we fixed it by removing the randomization when sampling the farthest points (only during testing/evaluation), as sketched below. The modified version can be found here: https://github.com/eslambakr/LAR-Look-Around-and-Refer/blob/651f2db61dbd721e0af7c42f6db438fda24d5bd6/referit3d/models/backbone/visual_encoder/pointnet2_utils.py
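
For reference, a minimal sketch of the idea (the linked file is what we actually use; the function name here is only for illustration): replace the torch.randint seed with a fixed starting index, so the same data always yields the same centroids.

import torch

def farthest_point_sample_deterministic(xyz, npoint):
    """
    Same as farthest_point_sample, but the first centroid is always point 0
    instead of a random point, so repeated evaluations of the same data
    return identical indices. (Sketch of the idea only; see the linked file
    for the actual fix.)
    """
    device = xyz.device
    B, N, C = xyz.shape
    centroids = torch.zeros(B, npoint, dtype=torch.long, device=device)
    distance = torch.full((B, N), 1e10, device=device)
    # Fixed starting index instead of torch.randint: removes the nondeterminism.
    farthest = torch.zeros(B, dtype=torch.long, device=device)
    batch_indices = torch.arange(B, dtype=torch.long, device=device)
    for i in range(npoint):
        centroids[:, i] = farthest
        centroid = xyz[batch_indices, farthest, :].view(B, 1, 3)
        dist = torch.sum((xyz - centroid) ** 2, -1)
        mask = dist < distance
        distance[mask] = dist[mask]
        farthest = torch.max(distance, -1)[1]
    return centroids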

GoostValley commented 1 year ago

many thanks!