[Paper] https://arxiv.org/abs/2107.10981
The code has been tested in the following environment:
| Package | Version | Comment |
| --- | --- | --- |
| PyTorch | 1.9.0 | |
| point_cloud_utils | 0.18.0 | For evaluation only. It loads meshes to compute point-to-mesh distances. |
| pytorch3d | 0.5.0 | For evaluation only. It computes point-to-mesh distances. |
| pytorch-cluster | 1.5.9 | We only use `fps` (farthest point sampling) to merge denoised patches. |
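For intuition, farthest point sampling greedily picks points that are maximally spread out, which is why it works well for merging overlapping denoised patches. A minimal numpy sketch of the idea (the code itself calls `fps` from pytorch-cluster; this toy function and data are ours):

```python
import numpy as np

def farthest_point_sampling(points, k):
    """Greedy FPS: repeatedly pick the point farthest from the
    set of points already selected."""
    selected = [0]  # start from an arbitrary seed point
    dist = np.linalg.norm(points - points[0], axis=1)
    for _ in range(k - 1):
        idx = int(np.argmax(dist))  # farthest remaining point
        selected.append(idx)
        # each point keeps its distance to the nearest selected point
        dist = np.minimum(dist, np.linalg.norm(points - points[idx], axis=1))
    return np.array(selected)

# Toy cloud with two clusters; FPS with k=2 picks one point from each.
pts = np.array([[0.0, 0.0, 0.0], [0.1, 0.0, 0.0],
                [5.0, 0.0, 0.0], [5.1, 0.0, 0.0]])
idx = farthest_point_sampling(pts, 2)  # picks indices 0 and 3, one per cluster
```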
conda env create -f env.yml
conda activate score-denoise
Download link: https://drive.google.com/file/d/1ZZ3EON8TTtwoRciT5ThcYU3sTtj9Kj7Z/view?usp=sharing
Please extract `score_dataset.zip` into the `data` folder. It contains the PU-Net, PCNet, and noisy LiDAR datasets.
Download link: https://drive.google.com/file/d/1NsNtRR9qhZRsc4GkUGIpU6FXMZpL5UFB/view?usp=sharing
Please extract `checkpoints.zip` into the `checkpoints` folder. It contains the basic model (`best.pt`), the ablation study model (`ablation2_best.pt`), and the unsupervised model (`unsup_best.pt`).
# basic training (supervised)
python train.py
# unsupervised training
python train.py --unsup True
# ablation study 2
python train.py --ablation2 True
Training takes about 39-40 hours.
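For reference, supervised training in the paper regresses a score that points from each noisy point back toward the clean surface; the ground-truth score can be approximated as the displacement to the nearest clean point. A minimal numpy sketch of that target (toy data and function names are ours, not the repo's API):

```python
import numpy as np

def score_targets(noisy, clean):
    """Ground-truth score ~ displacement from each noisy point to its
    nearest clean point (nearest-neighbor approximation of the surface)."""
    # pairwise squared distances, shape (N_noisy, N_clean)
    d2 = ((noisy[:, None, :] - clean[None, :, :]) ** 2).sum(-1)
    nn = d2.argmin(axis=1)   # index of each noisy point's nearest clean point
    return clean[nn] - noisy  # vector pointing back to the surface

clean = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
noisy = clean + np.array([[0.0, 0.2, 0.0], [0.0, -0.1, 0.0]])
s = score_targets(noisy, clean)  # [[0, -0.2, 0], [0, 0.1, 0]]
```

The network is trained so its predicted score matches these displacement targets in an L2 sense.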
# PUNet 10K
python test.py --dataset PUNet --resol 10000_poisson --denoise_iters 1 --noise 0.01
python test.py --dataset PUNet --resol 10000_poisson --denoise_iters 1 --noise 0.02
python test.py --dataset PUNet --resol 10000_poisson --denoise_iters 2 --noise 0.03
# PUNet 50K
python test.py --dataset PUNet --resol 50000_poisson --denoise_iters 1 --noise 0.01
python test.py --dataset PUNet --resol 50000_poisson --denoise_iters 1 --noise 0.02
python test.py --dataset PUNet --resol 50000_poisson --denoise_iters 2 --noise 0.03
# PCNet 10K
python test.py --dataset PCNet --resol 10000_poisson --denoise_iters 1 --noise 0.01
python test.py --dataset PCNet --resol 10000_poisson --denoise_iters 1 --noise 0.02
python test.py --dataset PCNet --resol 10000_poisson --denoise_iters 2 --noise 0.03
# PCNet 50K
python test.py --dataset PCNet --resol 50000_poisson --denoise_iters 1 --noise 0.01
python test.py --dataset PCNet --resol 50000_poisson --denoise_iters 1 --noise 0.02
python test.py --dataset PCNet --resol 50000_poisson --denoise_iters 2 --noise 0.03
# Ablation study (1)
python test.py --dataset PUNet --resol 10000_poisson --denoise_iters 1 --noise 0.01 --ablation1 True
python test.py --dataset PUNet --resol 10000_poisson --denoise_iters 1 --noise 0.02 --ablation1 True
python test.py --dataset PUNet --resol 10000_poisson --denoise_iters 1 --noise 0.03 --ablation1 True
# Ablation study (1)+iters.
python test.py --dataset PUNet --resol 10000_poisson --denoise_iters 1 --noise 0.01 --ablation1 True
python test.py --dataset PUNet --resol 10000_poisson --denoise_iters 1 --noise 0.02 --ablation1 True
python test.py --dataset PUNet --resol 10000_poisson --denoise_iters 2 --noise 0.03 --ablation1 True
# Ablation study (2): use checkpoint trained by ablation2
python test.py --dataset PUNet --resol 10000_poisson --denoise_iters 1 --noise 0.01 --checkpoint ./checkpoints/ablation2_best.pt
python test.py --dataset PUNet --resol 10000_poisson --denoise_iters 1 --noise 0.02 --checkpoint ./checkpoints/ablation2_best.pt
python test.py --dataset PUNet --resol 10000_poisson --denoise_iters 2 --noise 0.03 --checkpoint ./checkpoints/ablation2_best.pt
# Ablation study (3)
python test.py --dataset PUNet --resol 10000_poisson --denoise_iters 1 --noise 0.01 --ablation3 True
python test.py --dataset PUNet --resol 10000_poisson --denoise_iters 1 --noise 0.02 --ablation3 True
python test.py --dataset PUNet --resol 10000_poisson --denoise_iters 2 --noise 0.03 --ablation3 True
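Each `--denoise_iters` step above corresponds to one round of score-based gradient ascent: points move a small step along the predicted score. A toy sketch with an analytic Gaussian score instead of the network (step size and score function here are illustrative only):

```python
import numpy as np

def denoise_step(points, score_fn, step=0.2):
    """One denoising iteration: move each point along its score."""
    return points + step * score_fn(points)

# Toy score of an isotropic Gaussian centered at the origin:
# s(x) = -x / sigma^2, which points back toward the mode.
sigma2 = 1.0
score = lambda x: -x / sigma2

pts = np.array([[1.0, 0.0, 0.0], [0.0, 2.0, 0.0]])
for _ in range(2):  # e.g. --denoise_iters 2
    pts = denoise_step(pts, score)
# each step scales the toy points by 0.8, so two steps give 0.64x
```

Running more iterations at higher noise levels (as in the commands above) simply repeats this update with the network-predicted score.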
We implemented the whole denoising pipeline with the `pytorch`, `pytorch3d`, `pytorch-cluster`, and `point_cloud_utils` libraries. The datasets, hyper-parameter settings, and the basic skeleton code of the denoising model were borrowed from the original authors.
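The actual evaluation uses point-to-mesh distances via pytorch3d and point_cloud_utils; as a simplified, dependency-free stand-in, here is the symmetric Chamfer distance between two point sets in numpy (our own sketch, not the repo's metric code):

```python
import numpy as np

def chamfer_distance(a, b):
    """Symmetric Chamfer distance: mean squared nearest-neighbor
    distance from a to b, plus the same from b to a."""
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)  # (N, M) pairwise
    return d2.min(axis=1).mean() + d2.min(axis=0).mean()

a = np.zeros((1, 3))
b = np.array([[1.0, 0.0, 0.0]])
cd = chamfer_distance(a, b)  # 1.0 in each direction, so 2.0 total
```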
@InProceedings{Luo_2021_ICCV,
    author    = {Luo, Shitong and Hu, Wei},
    title     = {Score-Based Point Cloud Denoising},
    booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV)},
    month     = {October},
    year      = {2021},
    pages     = {4583-4592}
}