An online volumetric mapping-based approach for real-time detection of diverse dynamic objects in complex environments.
Credits
Setup
Examples
If you find this package useful for your research, please consider citing our paper:
@article{schmid2023dynablox,
  title={Dynablox: Real-time Detection of Diverse Dynamic Objects in Complex Environments},
  author={Schmid, Lukas and Andersson, Olov and Sulser, Aurelio and Pfreundschuh, Patrick and Siegwart, Roland},
  journal={IEEE Robotics and Automation Letters (RA-L)},
  year={2023},
  volume={8},
  number={10},
  pages={6259--6266},
  doi={10.1109/LRA.2023.3305239}
}
A brief overview of the problem, approach, and results is available on YouTube:
We were excited to learn that Dynablox has been integrated into NVIDIA's nvblox, where the algorithm's parallelism can make fantastic use of the GPU and detect moving objects fast and at high resolutions!
There is a Docker image available for this package. See the Docker Hub page for usage instructions.
If you have not already done so, install ROS. We recommend the Desktop-Full installation.
If you have not already done so, set up a catkin workspace:
mkdir -p ~/catkin_ws/src
cd ~/catkin_ws
catkin init
catkin config --extend /opt/ros/$ROS_DISTRO
catkin config --cmake-args -DCMAKE_BUILD_TYPE=RelWithDebInfo
catkin config --merge-devel
Install system dependencies:
sudo apt-get install python3-vcstool python3-catkin-tools ros-$ROS_DISTRO-cmake-modules protobuf-compiler autoconf git rsync -y
Clone the repository using SSH keys:
cd ~/catkin_ws/src
git clone git@github.com:ethz-asl/dynablox.git
Install ROS dependencies:
cd ~/catkin_ws/src
vcs import . < ./dynablox/ssh.rosinstall --recursive
Build:
catkin build dynablox_ros
To run the demos, we use the Urban Dynamic Objects LiDAR (DOALS) Dataset. To download the data and pre-process it for our demos, use the provided script:
roscd dynablox_ros/scripts
./download_doals_data.sh /home/$USER/data/DOALS # Or your preferred data destination.
We further collected a new dataset featuring diverse dynamic objects in complex scenes. The full dataset and its description can be found here. To download the processed, ready-to-run data for our demos, use the provided script:
roscd dynablox_ros/scripts
./download_dynablox_data.sh /home/$USER/data/Dynablox # Or your preferred data destination.
If you have not done so already, download the DOALS dataset as explained here.
Adjust the dataset path in dynablox_ros/launch/run_experiment.launch:
<arg name="bag_file" default="/home/$(env USER)/data/DOALS/hauptgebaeude/sequence_1/bag.bag" />
Run
roslaunch dynablox_ros run_experiment.launch
You should now see dynamic objects being detected as the sensor moves through the scene:
If you have not done so already, download the Dynablox dataset as explained here.
Adjust the dataset path in dynablox_ros/launch/run_experiment.launch and set use_doals to false:
<arg name="use_doals" default="false" />
<arg name="bag_file" default="/home/$(env USER)/data/Dynablox/processed/ramp_1.bag" />
Run
roslaunch dynablox_ros run_experiment.launch
You should now see dynamic objects being detected as the sensor moves through the scene:
If you have not done so already, download the DOALS dataset as explained here.
Adjust the dataset path in dynablox_ros/launch/run_experiment.launch:
<arg name="bag_file" default="/home/$(env USER)/data/DOALS/hauptgebaeude/sequence_1/bag.bag" />
In dynablox_ros/launch/run_experiment.launch, set the evaluate flag, adjust the ground-truth data path, and specify where to store the generated output data:
<arg name="evaluate" default="true" />
<arg name="eval_output_path" default="/home/$(env USER)/dynablox_output/" />
<arg name="ground_truth_file" default="/home/$(env USER)/data/DOALS/hauptgebaeude/sequence_1/indices.csv" />
Run
roslaunch dynablox_ros run_experiment.launch
Wait until the dataset has finished processing; Dynablox should shut down automatically afterwards.
Printing the Detection Performance Metrics:
roscd dynablox_ros/src/evaluation
python3 evaluate_data.py /home/$USER/dynablox_output
1/1 data entries are complete.
Data object_IoU object_Precision object_Recall
hauptgebaeude_1 89.8 +- 5.6 99.3 +- 0.4 90.3 +- 5.6
All 89.8 +- 5.6 99.3 +- 0.4 90.3 +- 5.6
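The reported numbers follow the standard point-level definitions of IoU, precision, and recall over dynamic/static labels. As a rough sketch of what evaluate_data.py computes per frame (the function name and boolean-mask representation are illustrative assumptions, not the script's actual interface):

```python
import numpy as np

def detection_metrics(predicted: np.ndarray, ground_truth: np.ndarray):
    """Point-level IoU, precision, and recall for dynamic-object detection.

    Both inputs are boolean masks over the points of one scan, where True
    marks a point labeled as dynamic.
    """
    tp = np.sum(predicted & ground_truth)   # correctly detected dynamic points
    fp = np.sum(predicted & ~ground_truth)  # static points flagged as dynamic
    fn = np.sum(~predicted & ground_truth)  # missed dynamic points
    iou = tp / (tp + fp + fn)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return iou, precision, recall

# Toy example: 4 of 5 dynamic points detected, plus 1 false positive.
pred = np.array([True, True, True, True, True, False, False, False])
gt = np.array([True, True, True, True, False, True, False, False])
iou, precision, recall = detection_metrics(pred, gt)
```

The table above then reports the mean and standard deviation of these per-frame values over each sequence.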
Inspecting the Segmentation:
roslaunch dynablox_ros cloud_visualizer.launch file_path:=/home/$USER/dynablox_output/clouds.csv
Inspecting the Run-time and Configuration:
Additional information is automatically stored in timings.txt and config.txt for each experiment.
Adding Drift to an Experiment:
To run an experiment with drift, specify one of the pre-computed drift rollouts in dynablox_ros/launch/run_experiment.launch:
<arg name="drift_simulation_rollout" default="doals/hauptgebaeude/sequence_1/light_3.csv" />
All pre-computed rollouts can be found in drift_simulation/config/rollouts. Note that the specified sequence needs to match the data being played. For each sequence, there exist three rollouts for each intensity.
Alternatively, use drift_simulation/launch/generate_drift_rollout.launch to create new rollouts for other datasets.
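Conceptually, a drift rollout is a pre-computed perturbation of the sensor poses over a sequence, so that localization error can be replayed reproducibly. As a simplified sketch of the idea (the actual rollout format and the generation logic in drift_simulation may differ), translational odometry drift can be modeled as an accumulated random walk whose step size controls the intensity:

```python
import numpy as np

def simulate_drift(num_frames: int, noise_std: float, seed: int = 0) -> np.ndarray:
    """Accumulate per-frame translational odometry noise into a drift trajectory.

    Returns an array of shape (num_frames, 3): the position offset added to the
    ground-truth pose at each frame. noise_std controls the drift intensity.
    """
    rng = np.random.default_rng(seed)
    per_frame_noise = rng.normal(0.0, noise_std, size=(num_frames, 3))
    return np.cumsum(per_frame_noise, axis=0)  # drift accumulates over time

drift = simulate_drift(num_frames=100, noise_std=0.01)
```

Because the rollout is generated once and stored, every run of the same sequence sees exactly the same pose error, which keeps experiments with drift comparable.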
Changing the Configuration of Dynablox:
All parameters of Dynablox are listed in dynablox_ros/config/motion_detector/default.yaml; feel free to tune the method for your use case!
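When tuning, it can be convenient to keep default.yaml untouched and apply only a small set of overrides per experiment. A minimal sketch of that pattern (the parameter names below are hypothetical placeholders, not the real keys in default.yaml):

```python
def merge_config(defaults: dict, overrides: dict) -> dict:
    """Recursively merge override values into a copy of the default config."""
    merged = dict(defaults)
    for key, value in overrides.items():
        if isinstance(value, dict) and isinstance(merged.get(key), dict):
            merged[key] = merge_config(merged[key], value)  # recurse into sub-dicts
        else:
            merged[key] = value  # override (or add) a leaf value
    return merged

# Hypothetical parameter names for illustration only; see default.yaml for
# the real ones.
defaults = {"motion_detector": {"voxel_size": 0.2, "num_threads": 4}}
overrides = {"motion_detector": {"voxel_size": 0.1}}
config = merge_config(defaults, overrides)
```

Keeping overrides in a separate file per experiment makes it easy to diff configurations against the stored config.txt of earlier runs.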