Our autonomous ground vehicle uses frontier-based exploration to navigate and map unknown environments. Equipped with sensors, it can avoid obstacles and make real-time decisions. It has potential applications in search and rescue, agriculture, and logistics, and represents an important step forward in autonomous ground vehicle development.
Cartographer compatibility fix and Odometry frame drift fixes #5
I would like to share the fixes I made while utilising this autonomous exploration algorithm in a project with a real robot and LiDAR:
Using Cartographer with this exploration algorithm
For those attempting to use Cartographer instead of slam_toolbox: Cartographer operates differently in that it builds up obstacle boundaries on the map based on confidence levels (percentages), which is why obstacles grow and fade in RViz when you use Cartographer. This causes issues because this codebase requires a bitmap of 0s and 1s, and Cartographer needs to scan an object for a long time before it reaches 100% confidence that the object exists. As a result, the algorithm is never able to discover new frontiers.
FIX: If you want to use Cartographer because your LiDAR or setup does not work well with slam_toolbox, create a ROS package that subscribes to the /map topic published by Cartographer, republishes it on a /remap topic, and supply the /remap topic to this algorithm. The package should apply logic similar to: "if Cartographer's confidence for this cell is >50%, map it as a 1 (occupied), otherwise map it as a 0 (free)". A sketch of such a node is shown below.
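Here is a minimal sketch of what that node could look like, assuming ROS 2 with rclpy and a nav_msgs/OccupancyGrid on /map. The node name is illustrative, and passing unknown (-1) cells through unchanged is my own assumption, so that frontiers (boundaries between known and unknown space) remain detectable:

```python
# Minimal sketch: threshold Cartographer's probabilistic /map into a
# binary occupancy grid on /remap. Assumes ROS 2 / rclpy.
import rclpy
from rclpy.node import Node
from nav_msgs.msg import OccupancyGrid


class MapThresholder(Node):
    def __init__(self):
        super().__init__('map_thresholder')  # node name is illustrative
        self.pub = self.create_publisher(OccupancyGrid, '/remap', 1)
        self.sub = self.create_subscription(
            OccupancyGrid, '/map', self.callback, 1)

    def callback(self, msg):
        # Cartographer publishes per-cell confidence as 0-100 (-1 = unknown).
        # Cells above 50% confidence become fully occupied (100, the "1" of
        # the bitmap); known cells at or below 50% become free (0).
        # Unknown cells (-1) are passed through so frontiers stay detectable
        # (an assumption on my part, not from the original description).
        msg.data = [100 if v > 50 else (0 if v >= 0 else -1)
                    for v in msg.data]
        self.pub.publish(msg)


def main():
    rclpy.init()
    rclpy.spin(MapThresholder())


if __name__ == '__main__':
    main()
```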
Odometry frame inaccuracy
In the TurtleBot simulation everything is usually ideal, so data such as the odometry frame is perfect and free of drift. In real life, the odometry frame can drift; it serves only as a relative reference through which the robot's base_link is localised within the map (tf tree: map -> odom -> base_link). Supplying the odometry frame to this algorithm leads to issues where the robot thinks it is outside the map boundary, or that it is already inside a wall. I discovered this when I attempted to write my own bitmap-based frontier exploration algorithm because I could not get this algorithm to work.
FIX: Instead of providing your /odom topic to the algorithm, extract the robot's pose in the map frame and publish it as a /base_link topic. This pose is usually not available directly as a topic, but you can obtain it by looking up the map -> base_link transform, which is usually exposed and extractable via tf. Providing this pose in place of the odometry frame allows the frontier exploration to run smoothly. See the sketch below.
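A minimal sketch under the same assumptions (ROS 2 / rclpy with tf2 available; the node name, the /base_link topic, and the 10 Hz publish rate are illustrative):

```python
# Minimal sketch: look up the map -> base_link transform via tf2 and
# republish it as a PoseStamped on /base_link. Assumes ROS 2 / rclpy;
# frame names must match your tf tree.
import rclpy
from rclpy.node import Node
from rclpy.time import Time
from geometry_msgs.msg import PoseStamped
from tf2_ros import Buffer, TransformListener, TransformException


class BaseLinkPublisher(Node):
    def __init__(self):
        super().__init__('base_link_publisher')  # node name is illustrative
        self.tf_buffer = Buffer()
        self.tf_listener = TransformListener(self.tf_buffer, self)
        self.pub = self.create_publisher(PoseStamped, '/base_link', 1)
        self.timer = self.create_timer(0.1, self.publish_pose)  # 10 Hz

    def publish_pose(self):
        try:
            # map -> base_link already composes map -> odom -> base_link,
            # so the result is the drift-corrected pose in the map frame.
            tf = self.tf_buffer.lookup_transform(
                'map', 'base_link', Time())
        except TransformException as e:  # transform not yet available
            self.get_logger().warn(f'tf lookup failed: {e}')
            return
        pose = PoseStamped()
        pose.header = tf.header
        pose.pose.position.x = tf.transform.translation.x
        pose.pose.position.y = tf.transform.translation.y
        pose.pose.position.z = tf.transform.translation.z
        pose.pose.orientation = tf.transform.rotation
        self.pub.publish(pose)


def main():
    rclpy.init()
    rclpy.spin(BaseLinkPublisher())


if __name__ == '__main__':
    main()
```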
Hopefully someone finds this helpful!