Misakatyan opened this issue 3 months ago (status: Open)
Could you share more detailed information? Since our model takes disparity as input, points with a depth of 0 may cause problems; we can interpolate the depth in those regions and then compute the disparity from it. The model may also perform poorly on large masks outdoors. In addition, we expect to release a better model in about a month, so stay tuned.
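For reference, a minimal sketch of one way to do that interpolation (the helper name and the use of OpenCV's inpainting are my own assumptions for illustration, not this repository's actual code):

```python
import cv2
import numpy as np

def fill_depth_and_to_disparity(depth: np.ndarray, eps: float = 1e-6) -> np.ndarray:
    """Interpolate zero-depth holes, then convert depth to disparity (inverse depth).

    `depth` is an HxW float array where 0 marks invalid pixels. Using
    cv2.inpaint over an 8-bit normalization is an assumption; any
    hole-filling interpolation would do.
    """
    valid = depth > eps
    holes = (~valid).astype(np.uint8)  # 1 where depth is missing
    d_min, d_max = depth[valid].min(), depth[valid].max()
    # cv2.inpaint expects 8-bit input, so normalize, fill, then map back.
    depth8 = np.clip((depth - d_min) / (d_max - d_min + eps) * 255.0, 0, 255).astype(np.uint8)
    filled8 = cv2.inpaint(depth8, holes, inpaintRadius=3, flags=cv2.INPAINT_TELEA)
    filled = filled8.astype(np.float32) / 255.0 * (d_max - d_min) + d_min
    return 1.0 / np.maximum(filled, eps)  # monocular disparity = inverse depth
```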
Okay, thank you for your reply. Due to network restrictions at my company I cannot upload images, so I will describe the problem as best I can. I tested the kitchen scene from Mip-NeRF 360 and selected a reference image with the toy LEGO shovel between two pieces of cloth. When I projected the 2D inpainting result back into 3D space using a seemingly perfect depth map, the inpainted part always floated under the table, at an oblique angle to the table plane. I also used the save_ply function to unproject all the point clouds, and found a significant tilt between the unprojected points and the original 3DGS; they could not be aligned (the same happens with depth obtained by rendering instead of inpainting). I checked the camera intrinsics and extrinsics for the corresponding images and found no issues, so where do you think the problem might lie? Thank you very much for your help.
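For anyone debugging a rigid tilt like this, the usual culprit is a pose-convention mismatch rather than bad intrinsics. A minimal unprojection sketch, assuming OpenCV axis conventions and camera-to-world poses (both assumptions; this repository's actual conventions may differ):

```python
import numpy as np

def unproject(depth: np.ndarray, K: np.ndarray, c2w: np.ndarray) -> np.ndarray:
    """Unproject an HxW depth map to world-space points (Nx3).

    Assumes OpenCV conventions (x right, y down, z forward) and that
    `c2w` is a 4x4 camera-to-world matrix. If your poses are
    world-to-camera (as COLMAP stores them), invert them first; using
    the wrong direction, or mixing OpenCV with OpenGL/Blender axes
    (y up, z backward), produces exactly this kind of global tilt.
    """
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth.reshape(-1)
    x = (u.reshape(-1) - K[0, 2]) / K[0, 0] * z
    y = (v.reshape(-1) - K[1, 2]) / K[1, 1] * z
    pts_cam = np.stack([x, y, z], axis=-1)            # camera space
    return pts_cam @ c2w[:3, :3].T + c2w[:3, 3]       # world space
```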
I have a similar issue, illustrated by the following two images. The inpainted area looks fine from the inpainting view, but in its true 3D position it is tilted relative to the ground surface.
Also, fine-tuning on one view causes Gaussians that are invisible from that view to degrade randomly in other parts of the scene.
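One generic workaround for that degradation, sketched under the assumption that you can obtain a per-Gaussian visibility mask (e.g. from the rasterizer's returned radii; this is not the repository's code), is to zero the gradients of Gaussians not visible in the fine-tuning view:

```python
import torch

def zero_invisible_grads(params: list[torch.Tensor], visible: torch.Tensor) -> None:
    """After loss.backward(), zero gradients of Gaussians invisible in the
    current view so a single-view loss cannot perturb them.

    `visible` is a boolean mask over Gaussians (e.g. radii > 0 from the
    rasterizer). Each tensor in `params` (means, scales, opacities, ...)
    is assumed to have the Gaussian dimension first.
    """
    for p in params:
        if p.grad is not None:
            p.grad[~visible] = 0.0
```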
I completed each of the preceding steps as required and obtained correct depth estimates, but the projection is inaccurate in the final step, when the point cloud is generated. Has anyone encountered the same problem? If so, please let me know how to solve it. Thank you.
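If it helps to narrow this down, one round-trip check worth trying (a sketch with assumed names, not the project's API): project the unprojected points back into the image. If the pixel error is near zero but the cloud is still tilted against the scene, the intrinsics and math are self-consistent and the pose convention is the likely problem.

```python
import numpy as np

def reprojection_error(pts_world: np.ndarray, K: np.ndarray,
                       w2c: np.ndarray, uv_expected: np.ndarray) -> float:
    """Project world-space points back to pixels and return the max error.

    `pts_world` is Nx3, `w2c` a 4x4 world-to-camera matrix, and
    `uv_expected` the Nx2 pixel grid the points were unprojected from.
    """
    pts_cam = pts_world @ w2c[:3, :3].T + w2c[:3, 3]
    uvz = pts_cam @ K.T                   # homogeneous pixel coordinates
    uv = uvz[:, :2] / uvz[:, 2:3]
    return float(np.abs(uv - uv_expected).max())
```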