
Wild-Places: A Large-Scale Dataset for Lidar Place Recognition in Unstructured Natural Environments

Website (https://csiro-robotics.github.io/Wild-Places/) | Paper | Data Download Portal

This repository contains the code implementation used in the paper Wild-Places: A Large-Scale Dataset for Lidar Place Recognition in Unstructured Natural Environments, which has been accepted for publication at ICRA2023.

If you find this dataset helpful for your research, please cite our paper using the following reference:

@inproceedings{2023wildplaces,
  title={Wild-Places: A Large-Scale Dataset for Lidar Place Recognition in Unstructured Natural Environments},
  author={Knights, Joshua and Vidanapathirana, Kavisha and Ramezani, Milad and Sridharan, Sridha and Fookes, Clinton and Moghadam, Peyman},
  year={2023},
  booktitle={IEEE International Conference on Robotics and Automation (ICRA)},
  eprint={2211.12732},
  archivePrefix={arXiv}
}

Contents

  1. Updates
  2. Download Instructions
  3. Scripts
  4. Thanks

1. Updates

2. Download Instructions

Our dataset can be downloaded through the CSIRO Data Access Portal. Detailed instructions for downloading the dataset can be found in the README file provided on the data access portal page.

3. Scripts

3.1 Environment

To create a Python environment for using the scripts in this repository, run the following command:

conda env create -f scripts/Wild-Places.yaml -n Wild-Places
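Once created, the environment can be activated before running any of the scripts below:

conda activate Wild-Places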

3.2 Loading Point Clouds

A code snippet for loading a point cloud file from our dataset can be found in eval/load_pointcloud.py.
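For a rough idea of what this involves, below is a minimal sketch of loading a single submap with Open3D. It assumes the clouds are stored as standard .pcd files and that Open3D is available in your environment; the example path is purely hypothetical. Refer to eval/load_pointcloud.py for the exact handling used in this repository.

import numpy as np
import open3d as o3d

def load_pointcloud(path):
    # Read a single submap and return an (N, 3) array of XYZ coordinates.
    # Assumes a standard .pcd file; see eval/load_pointcloud.py for the
    # dataset-specific loading used by the evaluation scripts.
    pcd = o3d.io.read_point_cloud(path)
    return np.asarray(pcd.points, dtype=np.float32)

# Hypothetical path, for illustration only.
points = load_pointcloud("Venman/V-01/Clouds/1634800000.000000.pcd")
print(points.shape)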

3.3 Generating Training & Testing Splits

In this repository we provide several scripts for partitioning our dataset into splits for training and evaluation.
These scripts output pickle files containing the training and evaluation splits in a format compatible with existing repositories such as PointNetVLAD, MinkLoc3D(v2), TransLoc3D and PPT.

Training

To generate the training splits run the following command:

python scripts/generate_splits/training_sets.py --dataset_root $_PATH_TO_DATASET --save_folder $_SAVE_FOLDER_PATH

Where $_PATH_TO_DATASET is the path to the downloaded dataset, and $_SAVE_FOLDER_PATH is the path to the directory where the generated files will be saved.
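As a quick sanity check, the generated training file can be inspected directly with pickle. The snippet below assumes a PointNetVLAD-style tuple format (a dictionary mapping each index to a query submap plus the indices of its positive and negative matches); the filename and exact keys are illustrative assumptions and may differ from what training_sets.py actually writes.

import pickle

# Hypothetical filename; use whatever training_sets.py writes to $_SAVE_FOLDER_PATH.
with open("training_wild_places.pickle", "rb") as f:
    training_tuples = pickle.load(f)

entry = training_tuples[0]
print(entry["query"])          # relative path to the query point cloud
print(entry["positives"][:5])  # indices of spatially close submaps
print(entry["negatives"][:5])  # indices of submaps treated as non-matches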

Testing

To generate the testing splits run the following command:

python scripts/generate_splits/testing_sets.py --dataset_root $_PATH_TO_DATASET --save_folder $_SAVE_FOLDER_PATH

This script will generate separate testing pickles for the inter-run and intra-run evaluation modes in each environment. The inter-run split produces query and database files for each testing environment, while the intra-run split produces a separate pickle for each individual point cloud sequence.

3.4 Evaluation

We provide scripts for both inter-run and intra-run evaluation on our dataset.

Inter-run Evaluation

To perform inter-run evaluation on the Wild-Places dataset, run the following command:

python eval/inter-sequence.py \
    --queries $_PATH_TO_QUERIES_PICKLES \
    --databases $_PATH_TO_DATABASES_PICKLES \
    --query_features $_PATH_TO_QUERY_FEATURES \
    --database_features $_PATH_TO_DATABASE_FEATURES \
    --location_names $_LOCATION_NAMES

Where $_PATH_TO_QUERIES_PICKLES and $_PATH_TO_DATABASES_PICKLES are the paths to the query and database pickles generated in Section 3.3, $_PATH_TO_QUERY_FEATURES and $_PATH_TO_DATABASE_FEATURES are the paths to the extracted global descriptors for the query and database submaps, and $_LOCATION_NAMES is the list of testing environment names.
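For intuition, the sketch below shows the kind of retrieval check an inter-run evaluation performs: each query descriptor is matched against the database descriptors, and the match counts as correct if the retrieved submap lies within a distance threshold of the query's ground-truth position. The array names, the 3 m threshold and the use of 2D positions are illustrative assumptions, not the exact interface of eval/inter-sequence.py.

import numpy as np
from scipy.spatial.distance import cdist

def recall_at_1(query_feats, db_feats, query_xy, db_xy, revisit_thresh=3.0):
    # query_feats: (Nq, D) and db_feats: (Nd, D) global descriptors,
    # query_xy / db_xy: (N, 2) ground-truth positions.
    dists = cdist(query_feats, db_feats)            # descriptor-space distances
    top1 = np.argmin(dists, axis=1)                 # best database match per query
    geo = np.linalg.norm(query_xy - db_xy[top1], axis=1)
    return float(np.mean(geo < revisit_thresh))     # fraction of true revisits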

Intra-run Evaluation

To perform intra-run evaluation on the Wild-Places dataset, run the following command:

python eval/intra-sequence.py \
    --databases $_PATH_TO_DATABASES_PICKLES \
    --database_features $_PATH_TO_DATABASE_FEATURES \
    --run_names $_LOCATION_NAMES

Where $_PATH_TO_DATABASES_PICKLES is the path to the intra-run database pickles generated in Section 3.3, $_PATH_TO_DATABASE_FEATURES is the path to the extracted global descriptors for those databases, and $_LOCATION_NAMES is the list of sequence names to evaluate.

4. Thanks

Special thanks to the authors of PointNetVLAD and MinkLoc3D, whose excellent code was used as a basis for the generation and evaluation scripts in this repository.