benhenryL / Deblurring-3D-Gaussian-Splatting


Question about the add_points #9

Open nyy618 opened 4 months ago

nyy618 commented 4 months ago

Thank you for your great work. You mentioned that KNN is used to compensate for the sparse points, but the add_points function in gaussian_model.py looks odd to me:

        def find_nearest_neighbors(new_point, N):
            distances = torch.norm(existing_points - new_point, dim=1)
            nearest_indices = torch.topk(-distances, N).indices
            return nearest_indices, distances

        interpolated_colors = []
        for i, new_point in enumerate(additional_points):
            if i % 10000 == 0:
                torch.cuda.empty_cache()
            nearest_indices, distances = find_nearest_neighbors(new_point, N)
            interpolated_feature = torch.zeros_like(existing_color)

            weights = distances[nearest_indices]
            mask = weights < min_dist
            weights = weights[mask]
            near_color = existing_color[nearest_indices]
            near_color = near_color[mask]
            if len(weights) == 0:
                interpolated_feature = dummy_color
                mask_pts[i] = 0
            else:
                wsum = weights.sum()
                weights /= wsum
                interpolated_feature = (near_color * weights[:,None, None]).sum(0)

I wonder whether the neighbor points with shorter distances should be given higher weights, i.e. whether interpolated_feature = (near_color * weights[:,None, None]).sum(0) should be changed to interpolated_feature = (near_color * torch.flip(weights,dims=[0])[:,None, None]).sum(0).
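
For reference, a more direct alternative would be inverse-distance weighting instead of flipping the distance-derived weights. Below is a minimal, self-contained sketch; the function name interpolate_color, the default N, and the eps stabilizer are just for illustration and not from gaussian_model.py, while the tensor shapes follow the snippet above:

    import torch

    # Minimal sketch of inverse-distance weighting (illustration only; the
    # function name, N, and eps are made up, not taken from the repository).
    def interpolate_color(new_point, existing_points, existing_color, N=4, eps=1e-8):
        # distances from the new point to every existing point
        distances = torch.norm(existing_points - new_point, dim=1)
        nearest = torch.topk(-distances, N).indices    # indices of the N closest points
        d = distances[nearest]
        w = 1.0 / (d + eps)                            # closer neighbor -> larger weight
        w = w / w.sum()                                # normalize so weights sum to 1
        # existing_color is [M, C1, C2], so broadcast weights over the color dims
        return (existing_color[nearest] * w[:, None, None]).sum(0)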

benhenryL commented 3 months ago

Hi! First, I apologize for the late response; I missed the notification. As you pointed out, the weights should be flipped so that points closer to the new point receive larger weights, and that is what we intended. I tested the first two scenes of each dataset (ball and basket for motion blur, cake and caps for defocus blur) with flipped weights and found no noticeable difference in reconstruction quality. I think this is because only four points are involved in the interpolation, and most of them lie close to the new point, so there is only a small gap between the original weights and the flipped weights.
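
To illustrate the small gap, here is a toy example with made-up distances (not measured from the datasets): when the four nearest neighbors are roughly equidistant, the normalized weights and their flipped counterparts are close, so the weighted color average barely changes.

    import torch

    # Toy illustration with hypothetical neighbor distances (not from the datasets).
    d = torch.tensor([0.010, 0.011, 0.012, 0.013])  # four nearly equidistant neighbors
    w = d / d.sum()                                 # weights as used in add_points
    w_flipped = torch.flip(w, dims=[0])             # weights after the suggested flip
    print(w)          # approx. [0.217, 0.239, 0.261, 0.283]
    print(w_flipped)  # approx. [0.283, 0.261, 0.239, 0.217]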

For reference, following are the results with flipped weights, measured in PSNR (reported vs. flipped weights):

- Ball: 28.27 vs. 28.23
- Basket: 28.42 vs. 27.90
- Cake: 26.88 vs. 26.94
- Caps: 24.50 vs. 24.34

Thank you for your attention and sorry for the confusion.