Motivated by the memory leak issue reported on the leggedrobotics/darknet_ros repository: while running the ROS node on a Jetson TX1, the node was killed after some time because memory usage grew without bound. This repository fixes the memory leak by implementing multithreading techniques (see the sketches after this list).
Removed Boost as a dependency.
Updated darknet version to AlexeyAB/darknet. This improves neural network performance by roughly 7% by fusing two layers into one: Convolutional + Batch-norm (illustrated below).
Implemented Object Detection and Tracking.
Updated darknet_ros_msgs according to the bbox_t structure.
Credits - Marko Bjelonic, AlexeyAB
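The two changes above are easiest to see in miniature. First, a hypothetical Python sketch of the multithreading pattern behind the memory-leak fix (the actual implementation in this repository is C++): the camera callback overwrites a single shared slot instead of queueing frames, so a detector that falls behind can never cause frames to pile up in memory.

```python
import threading

class LatestFrameBuffer:
    """Single-slot, thread-safe frame store: new frames replace old ones."""
    def __init__(self):
        self._lock = threading.Lock()
        self._frame = None

    def put(self, frame):
        # Called from the camera callback thread; drops any unconsumed frame.
        with self._lock:
            self._frame = frame

    def take(self):
        # Called from the detection thread; returns None if nothing new arrived.
        with self._lock:
            frame, self._frame = self._frame, None
            return frame
```

Second, a minimal NumPy sketch (not code from this repository) of how the Convolutional + Batch-norm fusion works: the per-channel batch-norm scale and shift are folded into the convolution's weights and bias, so a single layer does the work of two at inference time.

```python
import numpy as np

def fuse_conv_bn(w, b, gamma, beta, mean, var, eps=1e-5):
    """w: (out_ch, in_ch, k, k) conv weights; b: (out_ch,) conv bias;
    gamma, beta, mean, var: (out_ch,) batch-norm parameters."""
    scale = gamma / np.sqrt(var + eps)        # per-channel BN scaling
    w_fused = w * scale[:, None, None, None]  # fold scale into the weights
    b_fused = beta + (b - mean) * scale       # fold shift into the bias
    return w_fused, b_fused
```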
This is a ROS package developed for object detection in camera images. You only look once (YOLO) is a state-of-the-art, real-time object detection system. With this ROS package you can run YOLO (V3) on GPU and CPU. The pre-trained model of the convolutional neural network is able to detect the pre-trained classes of the VOC and COCO data sets, or you can create a network for your own detection objects. For more information about YOLO, Darknet, available training data and training YOLO, see the following link: YOLO: Real-Time Object Detection.
The YOLO packages have been tested under ROS Kinetic and Ubuntu 16.04. This is research code: expect it to change often, and any fitness for a particular purpose is disclaimed.
Based on the Pascal VOC 2012 dataset, YOLO can detect the 20 Pascal object classes:
Based on the COCO dataset, YOLO can detect the 80 COCO object classes:
This software is built on the Robot Operating System ([ROS]), which needs to be installed first. Additionally, YOLO for ROS depends on the following software:
In order to install darknet_ros, clone the latest version of this repository into your catkin workspace and compile the package using ROS:
cd catkin_workspace/src
git clone --recursive https://github.com/pushkalkatara/darknet_ros.git
cd ../
catkin_make -DCMAKE_BUILD_TYPE=Release
Darknet on the CPU is fast (approximately 1.5 seconds per image on an Intel Core i7-6700HQ CPU @ 2.60 GHz × 8), but it is about 500 times faster on a GPU! You will need an Nvidia GPU and you will have to install CUDA, a parallel computing platform and application programming interface (API) model created by Nvidia. The CMakeLists.txt file automatically detects whether you have CUDA installed; if you do not have CUDA on your system, the build process will fall back to the CPU version of YOLO. If you are compiling with CUDA, you might receive the following build error:
nvcc fatal : Unsupported gpu architecture 'compute_61'.
This means that you need to check the compute capability (version) of your GPU. You can find a list of CUDA-supported GPUs and their compute capabilities here: CUDA - WIKIPEDIA. Find the compute capability of your GPU and add a corresponding line to darknet_ros/CMakeLists.txt, similar to:
-O3 -gencode arch=compute_62,code=sm_62
The yolo-voc.weights and tiny-yolo-voc.weights are downloaded automatically in the CMakeLists.txt file. If you need to download them again, go into the weights folder and download the two pre-trained weights from the COCO data set:
cd catkin_workspace/src/darknet_ros/darknet_ros/yolo_network_config/weights/
wget http://pjreddie.com/media/files/yolov2.weights
wget http://pjreddie.com/media/files/yolov2-tiny.weights
And weights from the VOC data set can be found here:
wget http://pjreddie.com/media/files/yolov2-voc.weights
wget http://pjreddie.com/media/files/yolov2-tiny-voc.weights
And the pre-trained weights from YOLO v3 can be found here:
wget http://pjreddie.com/media/files/yolov3-voc.weights
wget http://pjreddie.com/media/files/yolov3.weights
In order to use your own detection objects you need to provide your weights and your cfg file inside the directories:
catkin_workspace/src/darknet_ros/darknet_ros/yolo_network_config/weights/
catkin_workspace/src/darknet_ros/darknet_ros/yolo_network_config/cfg/
In addition, you need to create your config file for ROS where you define the names of the detection objects. You need to include it inside:
catkin_workspace/src/darknet_ros/darknet_ros/config/
Then in the launch file you have to point to your new config file in the line:
<rosparam command="load" ns="darknet_ros" file="$(find darknet_ros)/config/your_config_file.yaml"/>
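As an illustration, such a config file could look like the following. The parameter layout matches the yolo.yaml parameters documented further below; the file names and class names here are placeholders for your own network:

```yaml
yolo_model:
  config_file:
    name: yolov3-custom.cfg        # searched for in yolo_network_config/cfg/
  weight_file:
    name: yolov3-custom.weights    # searched for in yolo_network_config/weights/
  threshold:
    value: 0.3
  detection_classes:
    names:
      - person
      - car
      - dog
```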
In order to get YOLO ROS: Real-Time Object Detection for ROS to run with your robot, you will need to adapt a few parameters. It is easiest if you duplicate and adapt all the parameter files that you need to change from the darknet_ros package. These are specifically the parameter files in the config folder and the launch file from the launch folder.
This is the main YOLO ROS: Real-Time Object Detection for ROS node. It uses the camera measurements to detect pre-learned objects in the frames.
You can change the names and other parameters of the publishers, subscribers and actions inside darknet_ros/config/ros.yaml.
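As a sketch, the remapping section of ros.yaml typically looks like the following; the layout shown here follows the upstream darknet_ros ros.yaml, so verify the keys against the file in this repository:

```yaml
subscribers:
  camera_reading:
    topic: /camera/rgb/image_raw
    queue_size: 1

publishers:
  object_detector:
    topic: /darknet_ros/found_object
    queue_size: 1
    latch: false
  bounding_boxes:
    topic: /darknet_ros/bounding_boxes
    queue_size: 1
    latch: false
  detection_image:
    topic: /darknet_ros/detection_image
    queue_size: 1
    latch: true
```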
Subscribed topic:
/camera_reading ([sensor_msgs/Image]): The camera measurements.
Published topics:
object_detector ([std_msgs::Int8]): Publishes the number of detected objects.
bounding_boxes ([darknet_ros_msgs::BoundingBoxes]): Publishes an array of bounding boxes that gives information about the position and size of each bounding box in pixel coordinates.
detection_image ([sensor_msgs::Image]): Publishes the detection image, including the bounding boxes.
Action:
camera_reading ([sensor_msgs::Image]): Sends an action with an image; the result is an array of bounding boxes.
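To consume the detections, a minimal rospy subscriber could look like the sketch below. The bounding box field names are an assumption: this fork states that its messages follow AlexeyAB's bbox_t (x, y, w, h, prob, obj_id, track_id), whereas upstream darknet_ros uses Class/probability/xmin/ymin/xmax/ymax, so check the message definition of your build before relying on them.

```python
#!/usr/bin/env python
import rospy
from darknet_ros_msgs.msg import BoundingBoxes

def on_boxes(msg):
    # Field names assume the bbox_t-style message layout described above.
    for box in msg.bounding_boxes:
        rospy.loginfo("object %d (track %d) prob=%.2f box=(x=%d, y=%d, w=%d, h=%d)",
                      box.obj_id, box.track_id, box.prob,
                      box.x, box.y, box.w, box.h)

if __name__ == "__main__":
    rospy.init_node("detection_listener")
    # Topic name assumes the default /darknet_ros namespace from ros.yaml.
    rospy.Subscriber("/darknet_ros/bounding_boxes", BoundingBoxes, on_boxes)
    rospy.spin()
```

The action interface can be exercised with a simple actionlib client. The action type and server name below (CheckForObjects under /darknet_ros) follow the upstream package and are assumptions for this fork; the camera topic is a placeholder as well.

```python
import actionlib
import rospy
from darknet_ros_msgs.msg import CheckForObjectsAction, CheckForObjectsGoal
from sensor_msgs.msg import Image

rospy.init_node("check_for_objects_client")
client = actionlib.SimpleActionClient("/darknet_ros/check_for_objects",
                                      CheckForObjectsAction)
client.wait_for_server()

goal = CheckForObjectsGoal()
# Grab a single frame from the camera and send it with the goal.
goal.image = rospy.wait_for_message("/camera/rgb/image_raw", Image)
client.send_goal(goal)
client.wait_for_result()
print(client.get_result().bounding_boxes)  # the detected bounding boxes
```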
You can change the parameters related to detection by adding a new config file that looks similar to darknet_ros/config/yolo.yaml.
yolo_model/config_file/name (string): Name of the cfg file of the network used for detection. The code searches for this file inside darknet_ros/yolo_network_config/cfg/.
yolo_model/weight_file/name (string): Name of the weights file of the network used for detection. The code searches for this file inside darknet_ros/yolo_network_config/weights/.
yolo_model/threshold/value (float): Threshold of the detection algorithm, defined between 0 and 1.
yolo_model/detection_classes/names (array of strings): Detection names of the network, given by the cfg and weights files inside darknet_ros/yolo_network_config/.
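Since the launch file loads these values under the darknet_ros namespace (see the rosparam line above), they can also be inspected at runtime. A quick sketch, assuming the default namespace:

```python
import rospy

rospy.init_node("param_check")
# The /darknet_ros prefix follows the ns="darknet_ros" attribute in the launch file.
threshold = rospy.get_param("/darknet_ros/yolo_model/threshold/value")
print("detection threshold:", threshold)
```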