wutong16 / Voxurf

[ ICLR 2023 Spotlight ] Pytorch implementation for "Voxurf: Voxel-based Efficient and Accurate Neural Surface Reconstruction"

How to support unbounded custom 360 dataset? #10

Open Learningm opened 1 year ago

Learningm commented 1 year ago

I read the code and found that the wo/mask setting only works for the DTU dataset. Could you give me some suggestions on how to support an unbounded custom 360 dataset without masks?

wutong16 commented 1 year ago

Hi @Learningm,

For the configuration, you can copy the config files for DTU, namely ./configs/dtu_e2e_womask, and change the corresponding data settings to point to your own data.
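For reference, a copied config might end up looking roughly like the sketch below. The key names are assumptions modeled on DVGO-style configs (which Voxurf builds on); the fields actually present in the existing DTU config are the authoritative reference.

```python
# my_custom_360.py -- hypothetical config, copied from the DTU womask config
_base_ = './default.py'              # inheriting shared defaults is an assumption; mirror the DTU config

expname = 'custom_scene_01'          # experiment name used for logging/output
basedir = './logs'

data = dict(
    datadir='./data/custom_scene_01',  # path to your preprocessed scene
    dataset_type='dtu',                # reuse the DTU-style loader if the data follows that layout
    inverse_y=True,                    # camera-convention flag; keep whatever the DTU config uses
    white_bkgd=False,                  # no mask, so do not force a white background
)
```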

For data preparation, we follow the setting of several previous works such as NeRF++ and assume that the foreground object lies within the unit ball. You'll need a coarse mask of the foreground object for each image, and then use our preprocessing code to perform the foreground normalization.
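The preprocessing itself is repo-specific, but the geometric idea of foreground normalization can be sketched as follows. This is only an illustration under the assumption that a rough object center and radius have already been estimated (e.g. from the coarse masks or sparse SfM points); the repo's actual script may instead emit NeuS/IDR-style scale_mat matrices.

```python
import numpy as np

def normalize_cameras(poses_c2w, center, radius, margin=1.1):
    """Rescale the world so the foreground object fits inside the unit ball.

    poses_c2w: (N, 4, 4) camera-to-world matrices.
    center, radius: rough object center (3,) and bounding radius in world units,
                    e.g. estimated from the coarse masks or sparse points.
    """
    scale = 1.0 / (radius * margin)                 # shrink the object so its radius < 1
    new_poses = poses_c2w.copy()
    # Recenter and rescale the camera origins; rotations are unchanged.
    new_poses[:, :3, 3] = (new_poses[:, :3, 3] - center) * scale
    return new_poses, scale
```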

Learningm commented 1 year ago

@wutong16 Cool, thanks for the quick reply!

As you mentioned above, it seems that we still need a coarse mask of the foreground object in the preprocessing stage. I'll try it.

Learningm commented 1 year ago

Hi @wutong16, I used the dtu_e2e_womask config to process the DTU data and my custom data.

The DTU result is good. My custom data result is also good with the w/mask setting (PSNR of almost 30), but with the wo/mask setting the result is much blurrier (PSNR below 20). Could you give some advice on how to improve the result with the wo/mask setting?

Another question: I find that the video rendered by the script with --interpolate only interpolates between angle1 and angle2. Is there an existing config to render a 360-degree video around the center object?
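For reference, a full orbit can in principle be generated directly as a set of camera-to-world poses rather than interpolating between two views. The sketch below is only an illustration; the function name, pose convention, and how the rendering script consumes poses are assumptions, not the repo's actual API.

```python
import numpy as np

def orbit_poses(n_frames=120, radius=3.0, elevation=0.3, center=np.zeros(3)):
    """Generate camera-to-world poses on a circle around `center`, each looking at it."""
    poses = []
    up = np.array([0.0, 0.0, 1.0])
    for t in np.linspace(0.0, 2.0 * np.pi, n_frames, endpoint=False):
        cam_pos = center + radius * np.array([np.cos(t), np.sin(t), elevation])
        forward = center - cam_pos
        forward /= np.linalg.norm(forward)
        right = np.cross(forward, up)
        right /= np.linalg.norm(right)
        new_up = np.cross(right, forward)
        c2w = np.eye(4)
        c2w[:3, 0] = right
        c2w[:3, 1] = new_up
        c2w[:3, 2] = -forward          # OpenGL-style convention: camera looks down -z
        c2w[:3, 3] = cam_pos
        poses.append(c2w)
    return np.stack(poses)
```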

zhouilu commented 11 months ago


I have the same questions as @Learningm above (the blurry wo/mask result and the 360-degree rendering). Have you solved them? Can you give me some advice?