inuex35 / 360-gaussian-splatting

This repository contains programs for reconstructing 3D space using OpenSfM and Gaussian Splatting techniques. It allows users to generate point clouds from images captured by a 360-degree camera using OpenSfM, and then train Gaussian Splatting models using the generated point clouds.

training process did not converge #9

Closed JunboLi-CN closed 2 days ago

JunboLi-CN commented 2 weeks ago

Dear Author(s),

Thank you for the great work! Could you please provide some example panorama test data with which you have successfully done a reconstruction? I have tried your implementation, but unfortunately the training process did not converge, and I want to find out whether my data has a problem. (If possible, please also include the output from OpenSfM.) Thanks a lot!

Regards, Li

inuex35 commented 2 weeks ago

Hello, you can try the data from the URL below. I use the omnigs branch of diff-gaussian-rasterization. https://dtbn.jp/IeaSdxhe (URL will expire in a week)

These are the training parameters.

self.iterations = 30_000
self.position_lr_init = 0.000016
self.position_lr_final = 0.00000016
self.position_lr_delay_mult = 0.01
self.position_lr_max_steps = 30_000
self.feature_lr = 0.0025
self.opacity_lr = 0.01
self.scaling_lr = 0.001
self.rotation_lr = 0.001
self.percent_dense = 0.01
self.lambda_dssim = 0.2
self.densification_interval = 100
self.opacity_reset_interval = 3000
self.densify_from_iter = 500
self.densify_until_iter = 15_000
self.densify_grad_threshold = 0.0002
self.random_background = False
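For reference, the original 3DGS codebase decays the position learning rate log-linearly from position_lr_init to position_lr_final over position_lr_max_steps. A minimal sketch of that schedule, assuming the same convention applies here (the function name is illustrative, not taken from this repository):

```python
import math

def position_lr(step, lr_init=0.000016, lr_final=0.00000016, max_steps=30_000):
    """Log-linear (exponential) decay from lr_init to lr_final over max_steps."""
    t = min(max(step / max_steps, 0.0), 1.0)
    # Interpolate in log space so the rate decays exponentially.
    return math.exp((1 - t) * math.log(lr_init) + t * math.log(lr_final))
```

At the halfway point this yields the geometric mean of the two rates, i.e. sqrt(lr_init * lr_final).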

JunboLi-CN commented 2 weeks ago

Hello again, thank you for the reply. I have tried your data and it converged. But I have another question: are the outputs only rendered panorama views from a fixed camera position, or is it like the original 3DGS, i.e. a 3D scene in which I can move the camera to view it from different perspectives? I used the SIBR_viewer to check the result, but it only looks reasonable at the initial camera position; once I move the camera, it looks completely wrong. So I am wondering whether the goal of your project is to first reconstruct a 3D scene and then render panorama images from the reconstructed scene, or just to render panorama images directly as outputs.
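For context on the panorama question: rendering an equirectangular view from a 3D scene amounts to mapping each 3D viewing direction to a pixel. A minimal sketch of that mapping, assuming a +z-forward, +y-up camera convention (this is an illustration, not the repository's actual projection code):

```python
import math

def direction_to_equirect(dx, dy, dz, width, height):
    """Map a unit 3D viewing direction to equirectangular pixel coordinates.

    Assumed convention: +z forward, +x right, +y up; longitude spans
    [-pi, pi) across the image width, latitude [-pi/2, pi/2] over the height.
    """
    lon = math.atan2(dx, dz)                   # 0 at the forward axis
    lat = math.asin(max(-1.0, min(1.0, dy)))   # clamp guards float noise
    u = (lon / (2 * math.pi) + 0.5) * width
    v = (0.5 - lat / math.pi) * height         # top row = straight up
    return u, v
```

The forward direction lands at the image center; straight up maps to the top row. If a trained scene only renders correctly from the training viewpoint, the geometry may have collapsed onto that viewpoint rather than forming a true 3D reconstruction.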

inuex35 commented 2 weeks ago

It does not happen in my environment. Could you tell me your environment? Which branch do you use? Did you use my reconstruction.json? The reconstruction should look like this.

[Sample image: Frame_00000_FinalColor]

[Reconstruction screenshot]

https://github.com/user-attachments/assets/63211e79-51c7-4f22-904f-e869556377b8
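As a quick way to sanity-check a reconstruction.json before training, one can count shots and points per reconstruction. A minimal sketch based on OpenSfM's documented JSON layout (a top-level list of reconstructions, each with "shots" and "points" dictionaries; the helper name is hypothetical):

```python
import json

def summarize_reconstruction(path):
    """Return (num_shots, num_points) per reconstruction in an OpenSfM
    reconstruction.json file.

    OpenSfM writes a JSON list of reconstructions; each entry maps image
    names to poses under "shots" and track ids to 3D points under "points".
    """
    with open(path) as f:
        reconstructions = json.load(f)
    return [(len(rec.get("shots", {})), len(rec.get("points", {})))
            for rec in reconstructions]
```

A reconstruction with very few shots or points is a common cause of non-converging training, so checking these counts first can save a debugging round-trip.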

JunboLi-CN commented 1 week ago

Hello, thank you for the information. I have reinstalled the environment, but the result still looks wrong.

I used the depth_normal_render and omnigs branches, and yes, I used your reconstruction.json file and the parameters you provided. Do you have any idea what the possible reason for that result could be?

Screencast from 09-02-2024 10:50:31 AM.webm

inuex35 commented 1 week ago

Thank you for the information. Could you try the main branch? The depth_normal_render branch may have some problems, and I'm going to fix it.

JunboLi-CN commented 1 week ago

I have just tried the main branch and it works! Thank you for the reply; it helps a lot!