@sktometometo
Where is the node that publishes /spot_recognition/bbox_array and /spot_recognition/tracking_labels?
Cc: @tongtybj
@sktometometo
Please add the human tracking node to your launch file, and I strongly recommend cleaning up your commits. Commit messages like "fix bugs" and "update" are trivial; please specify what kind of bugs or updates you addressed. Also, please add a description for this PR.
BTW, I am very interested in this demo. Can I try it today or tomorrow?
@k-okada @tongtybj
This demo requires the outputs of the rect_array_in_panorama_to_bounding_box_array node in object_detection_and_tracking.launch and of deep_sort_tracker_node.py in multi_object_tracker.launch. Currently, both launch files are in the jsk_spot_startup package.
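For reference, a downstream node can consume these outputs by subscribing to the topics asked about above. A minimal sketch, assuming the topics carry jsk_recognition_msgs/BoundingBoxArray and jsk_recognition_msgs/LabelArray (my reading of the pipeline; please check the actual publishers):

```python
#!/usr/bin/env python
# Minimal listener for the tracker outputs discussed in this thread.
# The message types are assumptions; verify them with `rostopic info`.
import rospy
from jsk_recognition_msgs.msg import BoundingBoxArray, LabelArray


def bbox_cb(msg):
    rospy.loginfo('received %d tracked boxes', len(msg.boxes))


def label_cb(msg):
    rospy.loginfo('tracking labels: %s', [l.name for l in msg.labels])


if __name__ == '__main__':
    rospy.init_node('tracking_output_listener')
    rospy.Subscriber('/spot_recognition/bbox_array', BoundingBoxArray, bbox_cb)
    rospy.Subscriber('/spot_recognition/tracking_labels', LabelArray, label_cb)
    rospy.spin()
```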
Sorry for the dirty commit history; this is a development branch. The demo now works, so I will clean up the commits and add descriptions.
I will go to the lab this evening.
@sktometometo
I also plan to go to the lab today. Can you show me this demo then?
@tongtybj Yes, I can show you the current demo. No problem.
@sktometometo do you have a bag file for this demo? @tongtybj is creating a tracking node, and he will create sample launch files that publish 3D person / human / car trajectories.
I think the rosbag files in this drive directory can be used.
To reproduce the recognition function of Spot, you need two ROS workspaces, because these rosbag files contain only sensor data outputs, and object detection is currently done by coral_usb_ros.
The first workspace can be created with the rosinstall file in my PR (the README.md file describes how to create a workspace for Spot).
The second one has to include coral_usb_ros, jsk_robot with my PR, and jsk_perception with my PR about rect_array_in_panorama_to_bounding_box_array.
To play the data back, please run three launch files:
```bash
# In the first workspace
roslaunch jsk_spot_startup play.launch rosbag:=<absolute path to rosbag>

# In the second workspace
roslaunch jsk_spot_startup object_detection_and_tracking.launch

# In the first workspace
roslaunch jsk_spot_startup multi_object_tracker.launch
```
The third launch file requires CUDA and chainer <= 6.7.0.
@sktometometo
Thank you for your kind instructions.
So, from my understanding, object detection (from coral_usb_ros) is used to provide the first bounding box to feed and start deep_sort_tracker, right? And why do you need two ROS workspaces? Is it because coral_usb_ros is based on Python 3?
I am developing a new tracker: https://github.com/tongtybj/detr, and I am going to try it with your rosbag data first. I hope my new tracker can outperform deep_sort_tracker.
@tongtybj rect_array_in_panorama_to_bounding_box_array and deep_sort_tracker run independently, since deep_sort_tracker uses only the object detection results in the 2D image plane. The integration of multi-object tracking with 3D geometric information is done in this demo.
And yes, coral_usb_ros requires Python 3, so we need to create two ROS workspaces.
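To illustrate what this integration means, here is a rough sketch that back-projects the center of a tracked 2D box to a 3D point with a pinhole model. The bbox_center_to_3d helper is hypothetical, and the real rect_array_in_panorama_to_bounding_box_array node works on a panorama image whose projection differs, so treat this as the idea only, not the implementation:

```python
# Illustrative only: lift a tracked 2D box center to a 3D point.
# bbox_center_to_3d is a hypothetical helper; the real node handles a
# panorama image, whose projection differs from this pinhole model.
import numpy as np
from image_geometry import PinholeCameraModel


def bbox_center_to_3d(cam_info, rect, depth):
    """cam_info: sensor_msgs/CameraInfo, rect: (x, y, w, h) in pixels,
    depth: distance to the target in meters."""
    model = PinholeCameraModel()
    model.fromCameraInfo(cam_info)
    u = rect[0] + rect[2] / 2.0
    v = rect[1] + rect[3] / 2.0
    ray = np.array(model.projectPixelTo3dRay((u, v)))  # unit ray, camera frame
    return ray / ray[2] * depth  # scale the ray so that z equals the depth
```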
@sktometometo I think you can create a bag file that includes images and /edgetpu_human_pose_estimator/output/poses, so that @tongtybj can skip the workspace setup.
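If it helps, such a reduced bag can be written with the rosbag Python API. A sketch; 'input.bag' and the image topic name below are placeholders to adapt:

```python
# Copy only the image and pose-estimation topics into a smaller bag so it
# can be replayed without the coral_usb_ros workspace. The input path and
# the image topic name are placeholders.
import rosbag

TOPICS = [
    '/dual_fisheye_to_panorama/output',  # placeholder image topic
    '/edgetpu_human_pose_estimator/output/poses',
]

with rosbag.Bag('filtered.bag', 'w') as outbag:
    for topic, msg, t in rosbag.Bag('input.bag').read_messages(topics=TOPICS):
        outbag.write(topic, msg, t)
```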
@k-okada OK, I will create it later.
This demo enables Spot to follow a person.
https://user-images.githubusercontent.com/9410362/112797926-1ae3b180-90a7-11eb-892a-21e9df53716e.mp4
Prerequisites
This demo requires the packages below (see the workspace discussion above): coral_usb_ros, jsk_robot with this PR, and jsk_perception with the rect_array_in_panorama_to_bounding_box_array PR.
How to run
Before running this demo, please launch the files below and prepare a controller:
jsk_spot_bringup.launch
object_detection_and_tracking.launch
multi_object_tracker.launch
And then, please run the demo launch file.
After this, you can start the following behavior by pressing the L2 button on the controller. Spot will follow whoever is the nearest person at the time of pressing. If you want to stop the behavior, please press the L2 button again.
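For reference, a minimal sketch of how such an L2 toggle can be wired up, assuming the controller publishes sensor_msgs/Joy on /joy and that L2 maps to button index 6 (both assumptions; this is not the PR's actual code):

```python
#!/usr/bin/env python
# Sketch of the L2-button toggle described above, not the PR's actual code.
# The /joy topic and button index 6 for L2 are assumptions; check your
# controller mapping with `rostopic echo /joy`.
import rospy
from sensor_msgs.msg import Joy

L2_INDEX = 6  # assumed index of the L2 button


class FollowToggle(object):
    def __init__(self):
        self.following = False
        self.last_pressed = False
        rospy.Subscriber('/joy', Joy, self.joy_cb)

    def joy_cb(self, msg):
        pressed = bool(msg.buttons[L2_INDEX])
        if pressed and not self.last_pressed:  # react to the rising edge only
            self.following = not self.following
            rospy.loginfo('following: %s', self.following)
            # here the demo would (re)select the nearest tracked person
        self.last_pressed = pressed


if __name__ == '__main__':
    rospy.init_node('follow_toggle_sketch')
    FollowToggle()
    rospy.spin()
```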