wbhu / Tri-MipRF

Tri-MipRF: Tri-Mip Representation for Efficient Anti-Aliasing Neural Radiance Fields, ICCV'23 (Oral, Best Paper Finalist)
https://wbhu.github.io/projects/Tri-MipRF

Difference between Zip-NeRF and Tri-MipRF? #2

Closed YJ-142150 closed 1 year ago

YJ-142150 commented 1 year ago

Great work!!! Could you explain more about "Zip-NeRF introduces a multi-sampling-based method to address the same problem, efficient anti-aliasing, while our method belongs to the pre-filtering-based method."? Tri-MipRF seems to be much faster. Is the PSNR also higher than Zip-NeRF's?

wbhu commented 1 year ago

Thanks for your interest. We didn't compare Tri-MipRF with Zip-NeRF in our paper as they are concurrent works. From my point of view, both works try to address the efficiency issue of anti-aliasing NeRF, but Zip-NeRF adopts a multi-sampling-based strategy while our method is pre-filtering-based (see the illustrative sketch below). For PSNR, we can only compare the results on the Blender dataset at this point.

(image: Tab. 5 from our paper)

(image: Tab. 4 from the Zip-NeRF paper)
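To make the distinction concrete, here is a minimal, purely illustrative sketch of the two strategies. This is not code from either repository: `field` and `mip_field` are hypothetical callables standing in for the radiance-field query, and the level-of-detail formula is a placeholder, not the papers' actual parameterization.

```python
# Conceptual sketch only -- not Tri-MipRF or Zip-NeRF code.
import torch

def antialias_by_multisampling(field, means, radii, n_samples=6):
    # Multi-sampling idea (Zip-NeRF style): approximate each cone segment by
    # several jittered point samples inside its footprint and average the
    # queried features.
    offsets = torch.randn(n_samples, *means.shape, device=means.device)
    points = means + radii[..., None] * offsets               # (n_samples, N, 3)
    feats = field(points.reshape(-1, 3))                      # (n_samples * N, C)
    return feats.reshape(n_samples, means.shape[0], -1).mean(dim=0)

def antialias_by_prefiltering(mip_field, means, radii, base_radius=1e-3):
    # Pre-filtering idea (Tri-MipRF style): query once per sample, but pick the
    # level of detail of a pre-filtered (mip-mapped) representation from the
    # footprint radius, so larger footprints read coarser, already-filtered
    # features instead of averaging many point queries.
    levels = torch.log2(radii / base_radius).clamp(min=0.0)   # (N,)
    return mip_field(means, levels)
```

In short, multi-sampling pays for anti-aliasing with extra field queries per sample, while pre-filtering pays once up front by building a filtered representation and then needs only a single query per sample.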

YJ-142150 commented 1 year ago

Wow! It seems Tri-MipRF is better than Zip-NeRF on the Blender synthetic dataset!! Did you try it on the NeRF-360 dataset, too?

tianxiaguixin002 commented 1 year ago

Can Tri-MipRF support 360° unbounded scenes? The experiment datasets in the paper are objects with masks. Thanks.

wbhu commented 1 year ago

Currently, it cannot support unbounded 360° scenes.