Code for Meshing Point Clouds with Predicted Intrinsic-Extrinsic Ratio Guidance (ECCV 2020). [paper]
We propose a novel mesh reconstruction method that leverages the input point cloud as much as possible, by predicting which triplets of points should form faces. Our key innovation is a surrogate of local connectivity, calculated by comparing the intrinsic and extrinsic metrics. We learn to classify the candidate triangles with a deep network and then feed the results to a post-processing module for mesh generation. Our method not only preserves fine-grained details and handles ambiguous structures, but also generalizes well to unseen categories.
a) Environment:
b) Download the submodules annoy (1.16) and SparseConvNet (0.2) and install SparseConvNet:
git submodule update --init --recursive
cd SparseConvNet/
sh develop.sh
annoy 1.17 changed its API, so please use the previous version (1.16).
c) Install plyfile and tqdm with pip (pickle is part of the Python standard library).
You can download the pretrained model and demo data from here to get a quick look. The demo data includes ten shapes (both gt meshes and point clouds) and their pre-generated pickle files. Each pickle file contains the point cloud vertices and the proposed candidate triangles (vertex indices and gt labels). You can use the pickle files to train or test the network; a quick way to inspect one is sketched below.
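For example, a pickle file can be inspected as follows. This is a minimal sketch; the file name and the assumed layout (vertices, candidate triangles, and labels) are illustrative only, so check an actual demo file for the real structure.

```python
# Minimal sketch: inspect a demo pickle file.
# The path and the assumed layout are illustrative, not the repo's documented format.
import pickle

with open("demo_data/shape_0.pickle", "rb") as f:  # hypothetical path
    data = pickle.load(f)

# Assumed contents: point cloud vertices (N, 3), candidate triangles (M, 3)
# given as vertex indices, and ground-truth labels (M,).
print(type(data))
if isinstance(data, dict):
    for key, value in data.items():
        print(key, getattr(value, "shape", type(value)))
```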
You can use network/test.py to classify the proposed candidate triangles. You can find the predicted labels (npy files) at log/shapenet_pretrained/test_demo. Each npy file covers 300,000 triangles, and a shape may have multiple npy files.
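Since the predictions for one shape may be split across several npy files, you may want to concatenate them before post-processing. A minimal sketch, assuming a per-shape file-name pattern (the actual naming may differ):

```python
# Minimal sketch: gather the predicted labels of one shape.
# The glob pattern "shape_0_*.npy" is an assumption about the output naming.
import glob
import numpy as np

pred_files = sorted(glob.glob("log/shapenet_pretrained/test_demo/shape_0_*.npy"))
labels = np.concatenate([np.load(f) for f in pred_files], axis=0)
print(labels.shape)  # one prediction per proposed candidate triangle
```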
You can feed the pickle files and the predicted npy files into a post-process program to get output meshes.
First, compile the C++ code:
cd postprocess
mkdir build
cd build
cmake ..
make
cd ..
Then, you can post-process all the demo shapes with run_demo.py or post-process a single shape with main.py. You can find the generated demo meshes at log/shapenet_pretrained/test_demo/output_mesh.
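Since plyfile is already a dependency, a quick sanity check on a generated mesh could look like this (the output file name is hypothetical):

```python
# Minimal sketch: load an output mesh and print its size.
from plyfile import PlyData

ply = PlyData.read("log/shapenet_pretrained/test_demo/output_mesh/shape_0.ply")  # hypothetical file
print(ply["vertex"].count, "vertices,", ply["face"].count, "faces")
```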
You can download all the pickle files for the full ShapeNet dataset from here (23,108 shapes, ~42.2 GB). Then use network/train.py to train your own network.
You can generate your own training data from gt meshes (ply).
First, compile the C++ code:
cd preprocess_with_gt_mesh
mkdir build
cd build
cmake ..
make
cd ..
Then, you can use main.py to generate the pickle file for a single shape, or run_demo.py to generate the pickle files for all the demo meshes. Generating the data for each shape may take several minutes; you can use multiple processes to accelerate it (see the sketch below).
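One simple way to parallelize this is to run main.py in several worker processes. This is only a sketch and assumes main.py takes the mesh path as a single command-line argument, which may not match its actual interface:

```python
# Minimal sketch: preprocess several gt meshes in parallel.
# ASSUMPTION: "python main.py <mesh.ply>" is not the documented CLI of this repo;
# adapt the command to how main.py actually expects its input.
import glob
import subprocess
from multiprocessing import Pool

def preprocess(mesh_path):
    subprocess.run(["python", "main.py", mesh_path], check=True)

if __name__ == "__main__":
    meshes = sorted(glob.glob("demo_data/*.ply"))  # hypothetical input folder
    with Pool(processes=8) as pool:
        pool.map(preprocess, meshes)
```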
In detail, the training data generation consists of several steps:
You can also generate pickle files with only point clouds (ply), so that you can feed the pickle files into the network and the postprocess program to get the final mesh.
First, compile the C++ code:
cd preprocess_with_pc
mkdir build
cd build
cmake ..
make
cd ..
Then, you can use main.py to generate the pickle file for a single shape, or run_demo.py to generate the pickle files for all the demo point clouds. Each shape should take less than one minute; you can use multiple processes to accelerate it. Please note that, in this case, the candidate labels in the pickle files will be set to -1.
The input point cloud should contain 12,000 to 12,800 points to best fit our pre-trained network. Using Poisson sampling as pre-processing yields an evenly distributed point cloud and thus boosts performance; see the sketch below. Currently, our method does not support very noisy point clouds.
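If you start from a mesh, one way to obtain such an evenly distributed point cloud is Poisson disk sampling, e.g. with Open3D. Open3D is not a dependency of this repo and the paths are illustrative; any Poisson sampler will do:

```python
# Minimal sketch: Poisson disk sampling a mesh down to ~12,000 points with Open3D.
import open3d as o3d

mesh = o3d.io.read_triangle_mesh("demo_data/shape_0.ply")      # hypothetical input mesh
pcd = mesh.sample_points_poisson_disk(number_of_points=12000)  # evenly distributed samples
o3d.io.write_point_cloud("demo_data/shape_0_pc.ply", pcd)      # input to the preprocessing step
```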
If you find our work useful for your research, please cite:
@article{liu2020meshing,
title={Meshing Point Clouds with Predicted Intrinsic-Extrinsic Ratio Guidance},
author={Liu, Minghua and Zhang, Xiaoshuai and Su, Hao},
journal={arXiv preprint arXiv:2007.09267},
year={2020}
}