The main goal of this build is to implement SLAM using 3D point clouds (for the ADS-DV) and a 2D laser (for the small-scale car). These sensors also provide odometry sources, to replace or complement wheel-based odometry.
General Software Pipeline: Perception -> Cone Recognition + Cluster Detection -> Cone Mapping -> SLAM (optional for simulation) -> Path Planning -> Path Follower
Major Features:
3D Graph SLAM
2D Laser SLAM
3D lidar odometry
2D lidar odometry
Cone distance check function: instead of relying on a point cloud distance cut-off to ignore far-away cones (which only works in simulation), a function computes the distance of each cone from the robot and rejects those outside the defined range (works in both simulation and the real world); see the sketch after this list
Implemented the skidpad routine provided by the path planning library
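A minimal sketch of the range check described in the cone distance check item above. The function name, signature, and the 10 m default are illustrative assumptions, not the project's actual API:

```python
import math

def filter_cones_by_range(cones, robot_pose, max_range=10.0):
    """Keep only cones within max_range metres of the robot.

    cones      -- iterable of (x, y) cone positions in the map frame
    robot_pose -- (x, y) robot position in the same frame
    max_range  -- rejection threshold in metres (10.0 is a placeholder)
    """
    rx, ry = robot_pose
    kept = []
    for cx, cy in cones:
        # Euclidean distance from robot to cone; reject if out of range.
        if math.hypot(cx - rx, cy - ry) <= max_range:
            kept.append((cx, cy))
    return kept
```

Unlike a raw point cloud cut-off, this check works on detected cone positions, so it behaves the same on real sensor data and in simulation.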
Minor Features:
Tweaked the /cmd_vel_to_ackermann_drive wheelbase configuration to fit the ADS-DV and the small-scale car; see the sketch after this list
ARM64 patch for hdl_graph_slam (Frank's solution)
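For context on the wheelbase tweak above: in the usual bicycle-model conversion from a Twist command to an Ackermann steering angle, the wheelbase determines how much steering a given angular velocity requires, so each vehicle needs its own value. This is a hedged sketch of that relationship, not the actual node, and the wheelbase values are placeholders rather than the real vehicle parameters:

```python
import math

def twist_to_steering_angle(v, omega, wheelbase):
    """Bicycle-model conversion from a Twist command to a front steering angle.

    v         -- linear velocity from cmd_vel (m/s)
    omega     -- angular velocity from cmd_vel (rad/s)
    wheelbase -- distance between front and rear axles (m)
    """
    if v == 0.0 or omega == 0.0:
        return 0.0
    turning_radius = v / omega
    return math.atan(wheelbase / turning_radius)

# The same cmd_vel yields different steering angles for a full-scale and a
# small-scale vehicle, hence the per-vehicle wheelbase configuration.
print(twist_to_steering_angle(2.0, 0.5, wheelbase=1.5))    # full-scale-sized (placeholder)
print(twist_to_steering_angle(2.0, 0.5, wheelbase=0.26))   # small-scale-sized (placeholder)
```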
TODO:
small-scale car communication WIP
lap counting
mission specific behaviours
ADS-DV API
Instructions:
Autonomous System Launch Sequence (full-scale ADS-DV model, with absolute odometry):
Go to gra/.devcontainer/docker-compose.yml and comment out the deploy section to disable GPU acceleration, then rebuild the Docker image in VS Code
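The deploy section is typically the Compose GPU reservation. The exact contents of gra/.devcontainer/docker-compose.yml may differ (the service name and build key below are assumptions), but this sketch shows what to look for and comment out:

```yaml
services:
  gra:                # service name is an assumption; use the one in the file
    build: .          # placeholder for the existing service settings
    # To disable GPU acceleration, comment out this entire deploy block:
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: 1
              capabilities: [gpu]
```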
Go to gra/ros/cone_detection/launch/tracker_with_cloud.launch
Comment line 17 and uncomment line 15
The following step reduces Gazebo CPU usage by scaling the simulation time: the simulator will appear slower, but CPU load will be reduced significantly.
Go to gra/ros/ackermann_vehicle_gazebo/worlds/gazebo_world_building/track_small_features.world
Line 76: set the value to 700 (see the snippet after this step).
Reducing the value reduces CPU load. The default is 1000, where one simulation second equals one real-time second; at 500, one simulation second takes two real-time seconds.
Start with 700 and reduce gradually if the cone recognition process is unable to stabilise. An unstable process shows up as cones swerving in RViz while the car is steering.
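These notes do not show what element line 76 holds; assuming it is Gazebo's real_time_update_rate inside the world's physics block (which matches the behaviour described above, since the real-time factor is roughly real_time_update_rate × max_step_size with the default 0.001 s step), the edit would look like:

```xml
<!-- Assumed context for line 76 of track_small_features.world.
     With max_step_size = 0.001 s, a real_time_update_rate of 700
     targets a real-time factor of about 0.7. -->
<physics type="ode">
  <max_step_size>0.001</max_step_size>
  <real_time_update_rate>700</real_time_update_rate>
</physics>
```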
Note to self: add these instructions to README or wiki