Created by Huizong Yang*, Yuxin Sun*, Ganesh Sundaramoorthi and Anthony Yezzi from Georgia Tech and Raytheon Technologies.
This is the code for training shape INRs (implicit neural representations) for 3D surface reconstruction from point clouds using our new second-order regularization and new shape representation. It supports training, testing, and evaluation for surface reconstruction.
Please follow the installation instructions below.
Our codebase uses PyTorch.
The code was tested with Python 3.9.12, torch 1.8.2, tensorboardX 2.3, CUDA 11.7 on Red Hat 4.8.5-44 (should work with later versions).
For a full list of requirements see the requirements.txt file. Note that we also use plotly-orca for visualisation, which needs to be installed from conda.
Example installation commands (PyTorch should be installed separately, as shown below):
conda create -n steik python=3.9.12
conda activate steik
conda install pip # for using pip commands in the conda env
pip install -r requirements.txt
conda install -c plotly plotly plotly-orca # plotly-orca is only available through conda
# Install PyTorch following the instructions at https://pytorch.org/get-started/locally/
# Below are the instructions for the long-term-support release (1.8.2 at the time of writing).
conda install pytorch torchvision torchaudio cudatoolkit=11.1 -c pytorch-lts -c nvidia
# for CUDA 10.2: conda install pytorch torchvision torchaudio cudatoolkit=10.2 -c pytorch-lts
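To verify the environment, a quick sanity check like the following (our own snippet, not part of the repo) confirms that PyTorch is installed and can see the GPU:

# check_env.py -- minimal environment sanity check (ours, not part of the repo)
import torch

print("torch version:", torch.__version__)           # expect 1.8.2 (or later)
print("CUDA available:", torch.cuda.is_available())  # expect True on a GPU machine
if torch.cuda.is_available():
    print("device:", torch.cuda.get_device_name(0))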
We implemented a 2D shape dataset generator (./sanitychecks/basic_shape_dataset2d.py) that includes three shapes: a circle, an L-shaped polygon, and a Koch snowflake. The code generally allows any polygonal shape to be used and can be extended to other 2D shapes; see the sketch below.
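To illustrate how a polygonal shape yields a 2D point cloud for training, here is a minimal sketch of sampling points uniformly along a polygon's boundary (the function and variable names are ours for illustration; the actual API in basic_shape_dataset2d.py may differ):

# Illustrative sketch only; see basic_shape_dataset2d.py for the actual generator.
import numpy as np

def sample_polygon_boundary(vertices, n):
    # Sample n points uniformly (by arc length) on the boundary of a closed polygon.
    v = np.asarray(vertices, dtype=np.float64)
    edges = np.roll(v, -1, axis=0) - v             # edge vectors; polygon closed implicitly
    lengths = np.linalg.norm(edges, axis=1)
    cum = np.concatenate([[0.0], np.cumsum(lengths)])
    t = np.random.uniform(0.0, cum[-1], n)         # arc-length positions along the boundary
    idx = np.searchsorted(cum, t, side="right") - 1
    frac = (t - cum[idx]) / lengths[idx]
    return v[idx] + frac[:, None] * edges[idx]

# Example: a 1000-point cloud on an L-shaped polygon
l_shape = [(0, 0), (2, 0), (2, 1), (1, 1), (1, 2), (0, 2)]
points = sample_polygon_boundary(l_shape, 1000)    # array of shape (1000, 2)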
To train a 2D shape neural representation and reconstruct the surface (a curve in this case) for all three shapes, run:
cd sanitychecks
./scripts/run_train_test_basic_shape.sh
To visualize the MFGI (multi-frequency geometric initialization) and geometric initializations, run ./sanitychecks/scripts/visualize_initializations.sh
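For context, geometric initialization (introduced in SAL and also used by DiGS) sets the MLP weights so that, before any training, the network approximates the signed distance to a sphere, giving the optimization a well-behaved starting zero level set. A rough sketch of the idea (our simplification, not the repo's exact code):

# Simplified SAL-style geometric initialization (illustrative, not the repo's code)
import math
import torch.nn as nn

def geometric_init(mlp, radius=1.0):
    linear_layers = [m for m in mlp if isinstance(m, nn.Linear)]
    for i, lin in enumerate(linear_layers):
        if i == len(linear_layers) - 1:
            # Final layer: output approximates ||x|| - radius
            nn.init.normal_(lin.weight, mean=math.sqrt(math.pi) / math.sqrt(lin.in_features), std=1e-5)
            nn.init.constant_(lin.bias, -radius)
        else:
            # Hidden layers: zero bias, scaled normal weights
            nn.init.constant_(lin.bias, 0.0)
            nn.init.normal_(lin.weight, 0.0, math.sqrt(2) / math.sqrt(lin.out_features))

net = nn.Sequential(nn.Linear(3, 256), nn.ReLU(),
                    nn.Linear(256, 256), nn.ReLU(),
                    nn.Linear(256, 1))
geometric_init(net, radius=1.0)  # zero level set now starts near the unit sphere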
The Surface Reconstruction Benchmark (SRB) data is provided in the Deep Geometric Prior repository. It can be downloaded via terminal into the data directory by running data/scripts/download_srb.sh (1.12GB download). We use the entire dataset (5 complex shapes).
If you use this data in your research, make sure to cite the Deep Geometric Prior paper.
We use a subset of the ShapeNet data as chosen by Neural Splines. This data is first preprocessed to be watertight as per the pipeline in the Occupancy Networks repository, which provides both the pipeline and the entire preprocessed dataset (73.4GB).
The Neural Splines split uses the first 20 shapes from the test set of 13 shape classes from ShapeNet. We provide a subset of the preprocessed ShapeNet data (the subset that corresponds to the Neural Splines split) and the resulting point clouds for that subset. These can be downloaded via terminal into the data directory by running data/scripts/download_shapenet.sh (783.76MB download).
If you use this data in your research, make sure to cite the ShapeNet and Occupancy Networks papers, and if you report on this split, compare to and cite the Neural Splines paper.
For scene reconstruction, we used the scene from the SIREN paper. It can be downloaded via terminal into the data directory by running data/scripts/download_scene.sh (56.2MB download).
If you use this data in your research, make sure to cite the SIREN paper.
To train, test, and evaluate on SRB, run:
./surface_reconstruction/scripts/run_surf_recon_exp.sh
Similarly, we provide a script for ShapeNet:
./surface_reconstruction/scripts/run_shapenet_recon.sh
and for scene reconstruction:
./surface_reconstruction/scripts/run_scene_recon_exp.sh
These scripts expose bash variables for changing the input data, the major hyperparameters, and the directories where checkpoints, logs, and meshes are saved.
Thanks to the authors of the DiGS codebase, on which we built.
Supported in part by Army Research Office (ARO) W911NF-22-1-0267 and by the Intelligence Advanced Research Projects Activity (IARPA) via Department of Interior/Interior Business Center (DOI/IBC) contract number 140D0423C0075. The U.S. Government is authorized to reproduce and distribute reprints for Governmental purposes notwithstanding any copyright annotation thereon. Disclaimer: The views and conclusions contained herein are those of the authors and should not be interpreted as necessarily representing the official policies or endorsements, either expressed or implied, of IARPA, DOI/IBC, or the U.S. Government.
If you find our work useful in your research, please cite our paper:
@misc{yang2023steik,
    title={StEik: Stabilizing the Optimization of Neural Signed Distance Functions and Finer Shape Representation},
    author={Huizong Yang and Yuxin Sun and Ganesh Sundaramoorthi and Anthony Yezzi},
    year={2023},
    eprint={2305.18414},
    archivePrefix={arXiv},
    primaryClass={cs.CV}
}
See the LICENSE file.