gt-marine-robotics-group / Virtuoso

ROS2 autonomy architecture for Georgia Tech Marine Robotics. Designed to be modular and work with any combination of sensors + motors.

Mapping YOLO results to Lidar results #150

Open mroglan opened 11 months ago

mroglan commented 11 months ago

Currently we rely on the lidar to detect buoys. We want to use the results of our YOLO model to assign a specific color to each buoy the lidar locates, so that we know both the location and the color of the buoys around the USV.

To get the location of buoys from the lidar, I would recommend using the information published to /perception/lidar/bounding_boxes. This topic carries bounding boxes for the objects the lidar detects; note that not all detected objects are buoys. Currently, other nodes filter out boxes that are too large to be buoys (see buoy_lidar.py).
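For reference, here is a minimal sketch of a node that subscribes to that topic and keeps only buoy-sized boxes. The message type (BoundingBoxArray) and its field names are assumptions on my part, as is the size threshold; check the definitions actually used in buoy_lidar.py.

```python
# Hypothetical sketch: listen to the lidar bounding boxes and keep only
# boxes small enough to be buoys. BoundingBoxArray and its fields are
# assumed, not the repo's confirmed message definition.
import rclpy
from rclpy.node import Node
from virtuoso_msgs.msg import BoundingBoxArray  # assumed message type

MAX_BUOY_SIDE = 1.0  # meters; assumed "too large to be a buoy" threshold


class BuoyBoxFilter(Node):

    def __init__(self):
        super().__init__('buoy_box_filter')
        self.create_subscription(
            BoundingBoxArray,
            '/perception/lidar/bounding_boxes',
            self.boxes_callback,
            10)

    def boxes_callback(self, msg):
        buoy_boxes = []
        for box in msg.boxes:
            # Assumed fields: axis-aligned extents of the detected cluster.
            width = box.max_x - box.min_x
            depth = box.max_y - box.min_y
            if width <= MAX_BUOY_SIDE and depth <= MAX_BUOY_SIDE:
                buoy_boxes.append(box)
        self.get_logger().info(f'{len(buoy_boxes)} buoy-sized boxes')


def main():
    rclpy.init()
    rclpy.spin(BuoyBoxFilter())


if __name__ == '__main__':
    main()
```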

The results of the YOLO model are published to a topic ending in /yolo_results. The topic's prefix is the left-most camera, which changes depending on the USV we are using. Additionally, a topic ending in /yolo_debug publishes debug images that draw a bounding box around each buoy the model detects.
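Once both sources are available, one way to do the actual mapping is to transform each lidar box centroid into the camera's optical frame (e.g. with tf2), project it into pixel coordinates using the camera intrinsics, and check which YOLO bounding box contains that pixel. A rough sketch of that matching step, with illustrative function names and tuple layouts rather than the repo's real message fields:

```python
# Hypothetical sketch of the lidar-to-YOLO matching step. Assumes the lidar
# buoy centroid has already been transformed into the camera's optical frame
# (x right, y down, z forward) and that camera intrinsics are known.

def project_to_pixel(point_cam, fx, fy, cx, cy):
    """Pinhole projection of a 3D camera-frame point into pixel coordinates."""
    x, y, z = point_cam
    if z <= 0.0:
        return None  # point is behind the camera; cannot match
    return (fx * x / z + cx, fy * y / z + cy)


def match_color(point_cam, detections, intrinsics):
    """Return the label of the YOLO box containing the projected point.

    detections: list of (label, x_min, y_min, x_max, y_max) in pixels,
    e.g. ('red_buoy', 120, 200, 160, 260). Returns None if no box matches.
    """
    pixel = project_to_pixel(point_cam, *intrinsics)
    if pixel is None:
        return None
    u, v = pixel
    for label, x_min, y_min, x_max, y_max in detections:
        if x_min <= u <= x_max and y_min <= v <= y_max:
            return label
    return None
```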

Both the Euclidean clustering and the YOLO model are launched when setup.launch.py from virtuoso_autonomy is launched.
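If the clustering and YOLO nodes are needed from another launch configuration, an include along these lines should work; the launch/ install path inside the package share is an assumption, so verify it against virtuoso_autonomy.

```python
# Sketch: include virtuoso_autonomy's setup.launch.py from another launch
# file. The 'launch' install directory is an assumed path.
import os

from ament_index_python.packages import get_package_share_directory
from launch import LaunchDescription
from launch.actions import IncludeLaunchDescription
from launch.launch_description_sources import PythonLaunchDescriptionSource


def generate_launch_description():
    autonomy_share = get_package_share_directory('virtuoso_autonomy')
    return LaunchDescription([
        IncludeLaunchDescription(
            PythonLaunchDescriptionSource(
                os.path.join(autonomy_share, 'launch', 'setup.launch.py'))),
    ])
```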