
Ranked 1st on the SemanticKITTI semantic segmentation leaderboard (both single-scan and multi-scan, Nov. 2020); accepted as a CVPR 2021 Oral.

Cylindrical and Asymmetrical 3D Convolution Networks for LiDAR Segmentation

The source code of our work "Cylindrical and Asymmetrical 3D Convolution Networks for LiDAR Segmentation".
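
The central design is to voxelize LiDAR points on a cylindrical grid (radius, azimuth, height) rather than a Cartesian one, so cells grow with distance and better match the uneven density of outdoor scans. Below is a minimal sketch of such a cylindrical partition; the grid size and range values are illustrative placeholders, not the settings used in this repository.

```python
import numpy as np

def cartesian_to_cylinder_voxels(points, grid_size=(480, 360, 32),
                                 rho_range=(0.0, 50.0), z_range=(-4.0, 2.0)):
    """Map (x, y, z) points to integer cylindrical voxel indices (rho, phi, z).

    grid_size and the ranges are illustrative placeholders, not this repo's settings.
    """
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    rho = np.sqrt(x ** 2 + y ** 2)      # radial distance from the sensor
    phi = np.arctan2(y, x)              # azimuth angle in [-pi, pi]
    cyl = np.stack([rho, phi, z], axis=1)

    lower = np.array([rho_range[0], -np.pi, z_range[0]])
    upper = np.array([rho_range[1],  np.pi, z_range[1]])
    cyl = np.clip(cyl, lower, upper - 1e-6)          # keep points inside the grid

    cell = (upper - lower) / np.asarray(grid_size, dtype=np.float64)
    return ((cyl - lower) / cell).astype(np.int64)   # per-point voxel index
```

For a scan loaded as an (N, 4) array, passing points[:, :3] yields an (N, 3) array of voxel indices on the cylindrical grid.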

News

Installation

Requirements

Data Preparation

SemanticKITTI

./
├── 
├── ...
└── path_to_data_shown_in_config/
    ├── sequences
        ├── 00/
        │   ├── velodyne/
        │   │   ├── 000000.bin
        │   │   ├── 000001.bin
        │   │   └── ...
        │   └── labels/
        │       ├── 000000.label
        │       ├── 000001.label
        │       └── ...
        ├── 08/ # for validation
        ├── 11/ # 11-21 for testing
        ├── 21/
        └── ...
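
Each velodyne/*.bin file stores the scan as float32 (x, y, z, remission) quadruples, and each labels/*.label file stores one uint32 per point whose lower 16 bits are the semantic class id (the standard SemanticKITTI format). A minimal loading sketch, independent of this repository's data loader:

```python
import numpy as np

def read_semantickitti_scan(bin_path, label_path=None):
    """Load one scan (and optionally its labels) from the layout above."""
    # Each point is stored as four float32 values: x, y, z, remission.
    points = np.fromfile(bin_path, dtype=np.float32).reshape(-1, 4)
    if label_path is None:
        return points, None
    raw = np.fromfile(label_path, dtype=np.uint32)
    semantic = raw & 0xFFFF   # lower 16 bits hold the semantic class id
    return points, semantic
```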

nuScenes

./
├── 
├── ...
└── path_to_data_shown_in_config/
        ├── v1.0-trainval
        ├── v1.0-test
        ├── samples
        ├── sweeps
        └── maps
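
Assuming the nuScenes devkit is installed, a quick way to check that this folder layout is indexable before editing the config (the dataroot below is a placeholder):

```python
from nuscenes.nuscenes import NuScenes

# dataroot must be the folder containing v1.0-trainval/, samples/, sweeps/, maps/
nusc = NuScenes(version='v1.0-trainval',
                dataroot='path_to_data_shown_in_config',
                verbose=True)
print(len(nusc.sample), 'samples indexed')
```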

Training

  1. Modify config/semantickitti.yaml with your custom settings. We provide a sample YAML for SemanticKITTI.
  2. Train the network by running "sh train.sh" (see the config sanity-check sketch after these steps).
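
Before launching training, the edited YAML can be loaded and inspected to confirm the dataset path is picked up. The key names below are illustrative assumptions; refer to the shipped config/semantickitti.yaml for the actual layout.

```python
import yaml

with open('config/semantickitti.yaml') as f:
    cfg = yaml.safe_load(f)

# 'dataset_params'/'data_path' are assumed key names for illustration only;
# inspect the shipped YAML to see the real ones before editing.
print(cfg.get('dataset_params', {}).get('data_path', '<not set>'))
```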

Training for nuScenes

Please refer to NUSCENES-GUIDE

Pretrained Models

-- We provide a pretrained model for SemanticKITTI: LINK1 or LINK2 (access code: xqmi)

-- For nuScenes dataset, please refer to NUSCENES-GUIDE

Semantic segmentation demo for a folder of lidar scans

python demo_folder.py --demo-folder YOUR_FOLDER --save-folder YOUR_SAVE_FOLDER

If you want to validate against your own data, you need to provide labels via the optional --demo-label-folder argument:

python demo_folder.py --demo-folder YOUR_FOLDER --save-folder YOUR_SAVE_FOLDER --demo-label-folder YOUR_LABEL_FOLDER

TODO List

Reference

If you find our work useful in your research, please consider citing our paper:

@article{zhu2020cylindrical,
  title={Cylindrical and Asymmetrical 3D Convolution Networks for LiDAR Segmentation},
  author={Zhu, Xinge and Zhou, Hui and Wang, Tai and Hong, Fangzhou and Ma, Yuexin and Li, Wei and Li, Hongsheng and Lin, Dahua},
  journal={arXiv preprint arXiv:2011.10033},
  year={2020}
}

# for LiDAR panoptic segmentation
@article{hong2020lidar,
  title={LiDAR-based Panoptic Segmentation via Dynamic Shifting Network},
  author={Hong, Fangzhou and Zhou, Hui and Zhu, Xinge and Li, Hongsheng and Liu, Ziwei},
  journal={arXiv preprint arXiv:2011.11964},
  year={2020}
}

Acknowledgments

We thank the open-source codebases PolarSeg and spconv.