wutong16 / Voxurf

[ ICLR 2023 Spotlight ] Pytorch implementation for "Voxurf: Voxel-based Efficient and Accurate Neural Surface Reconstruction"

train with custom data #26

Open wmrenr opened 1 year ago

wmrenr commented 1 year ago

Hello, first of all, thank you for your great contribution! I trained the network with 60 custom images, and the reconstructed model contains holes, as shown in the attached figure. I tried adjusting the training parameters, but this didn't solve the problem. Can you help me? I also found that some reconstructions I trained on images of other objects didn't contain holes, so I suspect the result depends on how the images were captured. How should images be collected to benefit the reconstruction results? Can you give some suggestions?

wutong16 commented 1 year ago

Hi!

I believe the hole could potentially be attributed to an inaccurate bounding box estimation during the data preprocessing phase. Please examine whether the hole resembles a typical cut created by a plane.

To address this issue, you can begin by manually increasing the bounding box area by adding a small buffer (e.g., 0.1) to xyz_max and subtracting a small buffer from xyz_min.
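A minimal sketch of that padding step (this is not the repo's actual preprocessing code; the function name, the NumPy usage, and where the box comes from are assumptions for illustration):

```python
import numpy as np

def expand_bbox(xyz_min, xyz_max, buffer=0.1):
    """Pad an axis-aligned bounding box by a fixed buffer on every side,
    so the reconstructed surface is not cut off at the box boundary."""
    xyz_min = np.asarray(xyz_min, dtype=np.float64) - buffer
    xyz_max = np.asarray(xyz_max, dtype=np.float64) + buffer
    return xyz_min, xyz_max

# Example: a unit cube grows to [-0.1, 1.1] along each axis.
lo, hi = expand_bbox([0.0, 0.0, 0.0], [1.0, 1.0, 1.0], buffer=0.1)
```

If the cut plane disappears after padding, the original box was too tight; you can then shrink the buffer to keep the voxel grid resolution focused on the object.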

Additionally, when capturing the video, strive to keep the object approximately centered within the frames and maintain a clean background. These practices will facilitate the production of precise masks and improve the accuracy of the bounding box estimation process.

wmrenr commented 1 year ago

Thank you for your guidance! I looked at the hole, and it does resemble a typical cut created by a plane. But I don't understand where to add a small buffer (e.g., 0.1) to xyz_max and subtract one from xyz_min. Is it in the compute_bbox_by_cam_frustrm() function or the compute_bbox_by_coarse_geo() function in run.py?

wutong16 commented 1 year ago

Sorry for the delayed follow-up! Please change the setting here, where the model is being initialized.