The goal of this project is to build a mobile robot system, in both simulation and real-world environments, that achieves SLAM, autonomous navigation, and pedestrian detection using a Velodyne VLP-16 LiDAR sensor and a Clearpath Jackal UGV.
The simulation environment for the Jackal and the Velodyne VLP-16 LiDAR is built in Gazebo and RViz.
The robot_localization package fuses wheel odometry with IMU data through an Extended Kalman Filter (EKF). It estimates the robot state and provides the transformation between the tf frames /odom and /base_link.
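As a rough sketch, the EKF configuration follows robot_localization's standard YAML format; the topic names and the choice of fused fields below are illustrative assumptions, not the project's exact settings:

```yaml
# Sketch of an ekf_localization_node config (topic names are assumptions)
frequency: 50
two_d_mode: true             # planar robot: ignore z, roll, pitch

odom_frame: odom
base_link_frame: base_link
world_frame: odom            # this node publishes the odom -> base_link transform

odom0: /jackal_velocity_controller/odom
odom0_config: [false, false, false,   # x, y, z positions
               false, false, false,   # roll, pitch, yaw
               true,  true,  false,   # x, y, z velocities
               false, false, true,    # angular velocities
               false, false, false]   # linear accelerations

imu0: /imu/data
imu0_config: [false, false, false,
              false, false, true,     # fuse yaw orientation
              false, false, false,
              false, false, true,     # fuse yaw rate
              false, false, false]
```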
The raw LiDAR point cloud is processed in C++ using the PCL library. The following steps are implemented:
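As one illustration of this kind of processing, ground points can be removed with a simple height threshold. The sketch below uses a plain struct in place of pcl::PointXYZ so it stands alone; the threshold value is an assumption, and the real node would more likely use pcl::PassThrough or plane segmentation:

```cpp
#include <vector>

// Minimal stand-in for pcl::PointXYZ.
struct Point { float x, y, z; };

// Remove points at or below a height threshold (a crude floor filter).
// In the actual node this would operate on a pcl::PointCloud.
std::vector<Point> removeFloor(const std::vector<Point>& cloud,
                               float z_min) {
    std::vector<Point> out;
    for (const Point& p : cloud) {
        if (p.z > z_min) out.push_back(p);
    }
    return out;
}
```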
I used the 2D SLAM package gmapping, which implements a Rao-Blackwellized Particle Filter to generate a 2D occupancy grid map and the localization of the robot within it (the transformation between the tf frames /map and /odom).
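A hedged sketch of how gmapping might be launched; the topic remapping and parameter values here are assumptions, not the project's actual settings:

```xml
<launch>
  <node pkg="gmapping" type="slam_gmapping" name="slam_gmapping">
    <!-- gmapping consumes a 2D scan; a 3D Velodyne cloud must first be
         flattened, e.g. with pointcloud_to_laserscan -->
    <remap from="scan" to="/front/scan"/>
    <param name="base_frame" value="base_link"/>
    <param name="odom_frame" value="odom"/>
    <param name="map_frame"  value="map"/>
    <param name="particles"  value="30"/>   <!-- RBPF particle count -->
    <param name="delta"      value="0.05"/> <!-- grid resolution [m] -->
  </node>
</launch>
```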
I used the move_base package to achieve autonomous navigation of the Jackal. It reads the map and localization from gmapping and plans global and local paths for the robot.
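For reference, a minimal sketch of the local planner limits one would tune for a Jackal; the values below are illustrative guesses, not the project's configuration:

```yaml
# base_local_planner (TrajectoryPlannerROS) sketch -- values are guesses
TrajectoryPlannerROS:
  max_vel_x: 0.5          # m/s, well below the Jackal's top speed
  min_vel_x: 0.1
  max_vel_theta: 1.0      # rad/s
  acc_lim_x: 1.0
  acc_lim_theta: 2.0
  holonomic_robot: false  # the Jackal is differential drive
```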
I used the hdl_people_tracking package to achieve pedestrian detection and tracking with the Velodyne LiDAR. It generates semantic segmentation of the human clusters in the point cloud.
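The core idea behind extracting human clusters is Euclidean clustering: points closer together than a tolerance are grouped. The sketch below is a naive single-link version on plain structs, only to illustrate the idea; hdl_people_tracking's actual detection pipeline is considerably more sophisticated:

```cpp
#include <cmath>
#include <vector>

struct Pt { float x, y, z; };

// Naive single-link Euclidean clustering: points within `tol` of any
// point already in a cluster join that cluster (BFS over neighbors).
std::vector<std::vector<Pt>> euclideanCluster(const std::vector<Pt>& pts,
                                              float tol) {
    std::vector<int> label(pts.size(), -1);
    std::vector<std::vector<Pt>> clusters;
    for (size_t i = 0; i < pts.size(); ++i) {
        if (label[i] != -1) continue;          // already assigned
        int id = static_cast<int>(clusters.size());
        clusters.push_back({});
        std::vector<size_t> frontier = {i};
        label[i] = id;
        while (!frontier.empty()) {
            size_t j = frontier.back();
            frontier.pop_back();
            clusters[id].push_back(pts[j]);
            for (size_t k = 0; k < pts.size(); ++k) {
                if (label[k] != -1) continue;
                float dx = pts[j].x - pts[k].x;
                float dy = pts[j].y - pts[k].y;
                float dz = pts[j].z - pts[k].z;
                if (std::sqrt(dx * dx + dy * dy + dz * dz) < tol) {
                    label[k] = id;
                    frontier.push_back(k);
                }
            }
        }
    }
    return clusters;
}
```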
The Jackal packages are released for ROS Indigo and Kinetic. To use them in ROS Melodic, the following build steps are required:
Install the dependencies in a terminal with sudo apt-get install:
Clone the original GitHub repo into the local catkin workspace and run catkin_make.
roslaunch winter_project simulation.launch
roslaunch winter_project real_jackal.launch
export ROS_MASTER_URI=http://CPR-J100-0076.local:11311
export ROS_HOSTNAME=robostation.local
roslaunch winter_project real_pc.launch
rosbag record
scp -r rosbag.bag ethan@robostation.local:~/Downloads
nmcli connection up JACKALROUTER24
ssh administrator@192.168.0.100
Password: clearpath
nmcli connection up Jackal
ssh administrator@cpr-j100-0076.local
sudo sixpair
sudo sixad --boot-yes
sudo rm -r winter_project/
scp -r /home/ethan/jackal_ws/src/winter_project/ administrator@cpr-j100-0076.local:~/chenge_ws/src
The problem is caused by the default settings of the robot_localization package. It can be solved by changing the EKF parameters in the following files:
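The exact edits depend on the symptom, but these are the kinds of robot_localization parameters typically adjusted; the values and field choices below are illustrative, not necessarily the project's actual fix:

```yaml
# Illustrative EKF tweaks -- not the project's exact change
two_d_mode: true          # constrain the estimate to the ground plane
imu0_differential: false  # fuse absolute IMU orientation, not deltas
imu0_remove_gravitational_acceleration: true
```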
The problem is caused by the noisy measurements of the IMU. The roll and pitch measurements have gravity as an absolute reference (measured by the accelerometer), while the yaw measurement has no such reference. Thus, the magnetometer is enabled in the imu_filter_madgwick package to provide an absolute yaw reference. The launch file is located at:
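Enabling the magnetometer amounts to a launch-file parameter along these lines (a sketch; the node name and surrounding wiring are assumptions):

```xml
<node pkg="imu_filter_madgwick" type="imu_filter_node" name="imu_filter">
  <!-- use magnetometer readings as the absolute yaw reference -->
  <param name="use_mag" value="true"/>
</node>
```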
VLP-16.urdf.xacro is the configuration file for the simulated Velodyne VLP-16 LiDAR. Change the samples param at the beginning of the file from 1875 to 200. This significantly improves the rendering frame rate (FPS) in RViz.
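The change sits in the macro's default arguments; roughly like this (abbreviated sketch — the actual signature carries more params):

```xml
<!-- VLP-16.urdf.xacro sketch: lower the number of horizontal samples -->
<xacro:macro name="VLP-16" params="*origin parent:=base_link samples:=200 ...">
```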
~/jackal_ws/src/velodyne/velodyne_pointcloud/launch/VLP16_points.launch is the configuration file on the Jackal for the velodyne_pointcloud package, which converts the raw LiDAR data into ROS sensor_msgs::PointCloud2 messages. Change the rpm param from 600 to 300. This reduces the publish frequency of the /velodyne_points topic, giving the Velodyne VLP-16 enough time to sample the full 360 degrees of the surrounding environment per frame. It fixes the issue in RViz where the point cloud blinks constantly because each frame contains incomplete surrounding information.
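The corresponding launch argument looks roughly like this (a sketch based on the stock velodyne_pointcloud launch file):

```xml
<!-- VLP16_points.launch: halve the assumed spin rate -->
<arg name="rpm" default="300.0"/>  <!-- stock default: 600.0 -->
```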
The PCL version used on the Jackal is 1.7. Running a node written with PCL causes a segmentation fault (core dumped) because PCL 1.7 does not support C++11. Therefore, I commented out the line in CMakeLists.txt that specifies compiling with C++11. The following commands were used in the debugging process:
sudo apt update
sudo apt install gdb
rosrun --prefix 'gdb --args' winter_project floor_removal
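The CMakeLists.txt change amounts to disabling the C++11 flag, roughly as follows (the exact line in the project may differ):

```cmake
# Commented out so the node builds against PCL 1.7, which predates C++11:
# add_compile_options(-std=c++11)
```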