
IDMP - Interactive Distance Field Mapping and Planning to Enable Human-Robot Collaboration

This is the code for IDMP, implementing a Gaussian process based distance and gradient field algorithm.

This codebase is implemented using ROS in both C++ and Python.

Description

We present an interactive distance field mapping and planning (IDMP) framework that handles dynamic objects and collision avoidance through an efficient representation. Given depth sensor data, our framework builds a continuous field that allows querying the distance and gradient to the closest obstacle at any required position. The key aspect of this work is an efficient Gaussian process field that performs incremental updates and implicitly handles dynamic objects with a simple and elegant formulation based on a temporary latent model.
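Purely as an illustration of what such distance-and-gradient queries enable (this is not IDMP's actual API), the sketch below pushes a waypoint out of collision by following the gradient returned by a hypothetical `query_dist_grad(point)` callable that stands in for however your setup queries the field:

```python
import numpy as np

# Conceptual sketch, not IDMP's actual interface: `query_dist_grad` is a
# hypothetical callable returning (distance, gradient) for a 3D point.
def push_out_of_collision(point, query_dist_grad, clearance=0.05, step=0.01, max_iters=100):
    p = np.asarray(point, dtype=float)
    for _ in range(max_iters):
        dist, grad = query_dist_grad(p)
        if dist >= clearance:
            break
        # The gradient of the distance field points away from the closest
        # obstacle, so stepping along it increases clearance.
        p += step * np.asarray(grad, dtype=float)
    return p
```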

Dependencies

Installation

IDMP is provided as a catkin package.

Running IDMP

Launch Files

We provide a few launch files to get started:

You can run these with `roslaunch idmp_ros {name}.launch`.

The camera pose has to be published as a TF transform. This is already done in the launch files. To change the pose, adjust the corresponding arguments of the static transform publisher.
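If you prefer to publish the camera pose from your own node instead of a launch file, a minimal `tf2_ros` sketch looks like the following; the frame names match the defaults from the parameter table below, and the translation/rotation values are placeholders to replace with your actual camera pose:

```python
import rospy
import tf2_ros
from geometry_msgs.msg import TransformStamped

rospy.init_node("camera_pose_publisher")

broadcaster = tf2_ros.StaticTransformBroadcaster()

t = TransformStamped()
t.header.stamp = rospy.Time.now()
t.header.frame_id = "base_link"          # reference frame (idmp_world_frame)
t.child_frame_id = "depth_camera_link"   # camera frame used in idmp_caminfo_topic
# Placeholder pose: camera 1 m above the base frame, identity rotation.
t.transform.translation.z = 1.0
t.transform.rotation.w = 1.0

broadcaster.sendTransform(t)
rospy.spin()
```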

Running IDMP standalone

If you do not want to use a launch file, you can also run the IDMP node standalone:

rosrun idmp_ros idmp

Keep in mind that it needs the parameters from the config file to be present in the parameter server at launch. You can load these with `rosparam load {file}.yaml`.

The config file also contains the topics of the input pointcloud, the camera intrinsics, and the reference frame. Make sure to adjust these to match your setup.
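To quickly verify that the configuration actually reached the parameter server (assuming it was loaded at the top level, which is the `rosparam load` default), a short check like this can help:

```python
import rospy

rospy.init_node("idmp_param_check", anonymous=True)

# Print the input-related parameters to make sure they match your setup.
for name in ("idmp_pcl_topic", "idmp_caminfo_topic", "idmp_world_frame"):
    print(name, "=", rospy.get_param(name, "<not set>"))
```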

Running Benchmark Datasets

The datasets can be found here:

Running the dataset with IDMP:

Using multiple cameras

IDMP supports the use of multiple cameras to capture a larger scene. For this, each camera has to publish its pose as a TF transform and its intrinsic parameters as a CameraInfo message. The pointclouds of all cameras need to be merged into a single pointcloud (preferably in the base frame) before being fed into IDMP.
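The merging itself is up to you; as a minimal sketch, the node below subscribes to two cameras (the topic names are placeholders), transforms both clouds into the base frame with tf2, and republishes a combined xyz-only cloud on the topic IDMP listens to. Any additional per-point fields such as color are dropped here.

```python
import rospy
import message_filters
import tf2_ros
from sensor_msgs.msg import PointCloud2
from sensor_msgs import point_cloud2
from tf2_sensor_msgs.tf2_sensor_msgs import do_transform_cloud

CAM_TOPICS = ["/cam1/depth/points", "/cam2/depth/points"]  # placeholder topics
BASE_FRAME = "base_link"    # should match idmp_world_frame
OUT_TOPIC = "/points_filt"  # should match idmp_pcl_topic

rospy.init_node("pointcloud_merger")
tf_buffer = tf2_ros.Buffer()
tf2_ros.TransformListener(tf_buffer)
pub = rospy.Publisher(OUT_TOPIC, PointCloud2, queue_size=1)

def callback(*clouds):
    points = []
    for cloud in clouds:
        # Transform each cloud into the common base frame before merging.
        tf = tf_buffer.lookup_transform(BASE_FRAME, cloud.header.frame_id,
                                        rospy.Time(0), rospy.Duration(0.1))
        transformed = do_transform_cloud(cloud, tf)
        points.extend(point_cloud2.read_points(transformed, field_names=("x", "y", "z")))
    header = clouds[0].header
    header.frame_id = BASE_FRAME
    pub.publish(point_cloud2.create_cloud_xyz32(header, points))

subs = [message_filters.Subscriber(topic, PointCloud2) for topic in CAM_TOPICS]
sync = message_filters.ApproximateTimeSynchronizer(subs, queue_size=5, slop=0.05)
sync.registerCallback(callback)
rospy.spin()
```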

Tools

We provide a set of tools to interact with IDMP for testing and demonstration purposes. You can run these with `rosrun idmp_ros {tool}.py`.

Parameters

All IDMP parameters can be found as yaml files in the config folder. These need to be loaded into the parameter server before launching IDMP.

| Parameter | Description | Recommended Value |
| --- | --- | --- |
| `idmp_rleng` | Cluster overlap factor for GP training | 1.8 |
| `idmp_tree_hl_min` | Minimum octree half-length | 0.025 |
| `idmp_tree_hl_max` | Maximum octree half-length | 3.2 |
| `idmp_tree_hl_clust` | Octree half-length of a cluster | 0.05 |
| `idmp_tree_hl_init` | Initial octree half-length | 1.6 |
| `idmp_map_scale` | Kernel map scale | 1024 |
| `idmp_filt_outl` | Enables outlier filtering for the input | True |
| `idmp_pub_pcl` | Publishes the octree pointcloud for visualization | True |
| `idmp_world_frame` | Reference frame (needs to be published as TF) | "base_link" |
| `idmp_pcl_topic` | Topic of the input pointcloud | "/points_filt" |
| `idmp_caminfo_topic` | List of CameraInfo topic and TF frame pairs, one per sensor | `- ["/depth/camera_info", "depth_camera_link"]` |
| `idmp_dynamic` | Enables dynamic object mode | True |
| `idmp_fusion` | Enables local fusion | False |
| `idmp_dyn_tresh` | Dynamic movement threshold | 0.08 |
| `idmp_fus_min` | Fusion minimum threshold | 0.0005 |
| `idmp_fus_max` | Fusion maximum threshold | 0.01 |
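As an alternative to `rosparam load`, the same parameters can also be pushed from Python; the dictionary below simply mirrors the recommended values from the table (adjust topics and frames to your setup, the node name is a placeholder):

```python
import rospy

rospy.init_node("idmp_config_loader", anonymous=True)

# Recommended values from the parameter table above.
idmp_params = {
    "idmp_rleng": 1.8,
    "idmp_tree_hl_min": 0.025,
    "idmp_tree_hl_max": 3.2,
    "idmp_tree_hl_clust": 0.05,
    "idmp_tree_hl_init": 1.6,
    "idmp_map_scale": 1024,
    "idmp_filt_outl": True,
    "idmp_pub_pcl": True,
    "idmp_world_frame": "base_link",
    "idmp_pcl_topic": "/points_filt",
    "idmp_caminfo_topic": [["/depth/camera_info", "depth_camera_link"]],
    "idmp_dynamic": True,
    "idmp_fusion": False,
    "idmp_dyn_tresh": 0.08,
    "idmp_fus_min": 0.0005,
    "idmp_fus_max": 0.01,
}

for name, value in idmp_params.items():
    rospy.set_param(name, value)
```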