Open yuedajiong opened 11 months ago
I am also working on implementing the regularizations, I am at the last one for smothing
i have experience with poisson reconstruction if that is what you need
Hello guys,
Sorry for not answering a lot recently, I was working hard on several projects.
As I explained in other issues, the code is basically finished and coming soon. Very, very soon, actually: I plan to release it today.
I just have a few tests to perform to check that the env file is working, and that I've not broken the code when reorganizing it.
Stay tuned, if everything goes well (and no bug is here), the code is out today.
@Anttwo You're awesome, you're my idol.
Waiting for your code, and try.
If you're willing, I'm happy to serve you by testing and optimizing code.
The surface recon is the most important one.
hello @yuedajiong, I have contacted you via email. I am looking forward to your reply!
Hi @yuedajiong and @cdcseacave , I am trying to implement the sdf and normals regularization on the orginal 3DGS. However, I met the problem that the density would be very large number with the equation in the paper. The code is attached below. Could you please take a look. Thanks!
`# sdf loss gaussians._xyz
# 随机取当前高斯点当作sample_points
valid_indices = torch.arange(30000, device="cuda")
cum_probs = valid_indices / 29999
random_indices = torch.multinomial(cum_probs, 30000, replacement=True)
random_change = torch.randn_like(
gaussians._xyz[random_indices]) * 10 # 生成与 gaussians._xyz[random_indices] 相同形状的随机数张量
sdf_samples = gaussians._xyz[random_indices] + random_change
# 计算sdf estimation
sdf_samples_z = sdf_samples[..., 2] + 0
sdf_samples_xy = sdf_samples[..., 0:2] + 0
proj_mask = sdf_samples_z > viewpoint_cam.znear
# 获取点在rendered depth上的深度
# 将点的坐标调整为深度图上的范围(-1到1之间)
points_normalized = sdf_samples_xy.clone()
points_normalized[:, 0] = (sdf_samples_xy[:, 0] / (depth.shape[2] - 1))
points_normalized[:, 1] = (sdf_samples_xy[:, 1] / (depth.shape[1] - 1))
grid = points_normalized.view(1, -1, 1, 2) # 将点的坐标构建成形状为 (1, num_points, 1, 2) 的 grid
sdf_samples_map_z = torch.nn.functional.grid_sample(rendered_depths, grid, mode='bilinear',
padding_mode='border')[0, 0, :, 0]
sdf_estimation = sdf_samples_map_z[proj_mask] - sdf_samples_z[proj_mask]
# df real
beta = gaussians._scaling.min(dim=-1)[0][random_indices].mean(dim=0)
opacity = gaussians._opacity[random_indices]
distance = sdf_samples - gaussians._xyz[random_indices]
covariance = gaussians.get_actual_covariance()[random_indices]
densities = torch.zeros(30000, 1)
for i in range(0, 29999):
density = opacity[i] * (
-0.5 * distance[i, :].T @ torch.inverse(torch.squeeze(covariance[i, :, :])) @ distance[i, :])
densities[i, 0] = density
density_threshold = 1.
opacity_min_clamp = 1e-16
clamped_densities = densities.clamp(min=opacity_min_clamp).to("cuda")
sdf_values = beta * torch.sqrt(-2. * torch.log(clamped_densities))
# sdf standerd deviation
sdf_estimation_loss = torch.mean((sdf_values - sdf_estimation.abs()))
Hi @neneyork, do you successfully implement the loss? I'm also trying to do that.
I urgently need a partner for discussion who shares the same direction.