Thank you for the awesome work. I am trying to reimplement the repo, but I have some questions; please give me some advice.
I ran the repo with my VLP-16 LiDAR using `roslaunch object3d_detector object3d_detector.launch`. I have successfully detected the human, but I can't get the human trajectory, and I don't know where the error is. The following pictures show the running screen, the `rostopic list` output, and the `rosrun rqt_graph rqt_graph` output.
How does the UKF work? Does it use only the position (x, y) obtained from the LiDAR, or is the velocity also obtained from the LiDAR sensor somehow?
Hi, we actually get the object velocity from the tracker rather than directly from the lidar data; the tracker obtains the object's 2D position (x, y) from the detector.
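To illustrate the idea (this is a minimal sketch, not the repo's actual UKF implementation): a tracker with a constant-velocity motion model can recover velocity purely from the detector's (x, y) position measurements, because velocity is part of the filter state. Since the constant-velocity model is linear, a plain Kalman filter is enough to show the principle; the time step `dt` and noise matrices below are illustrative assumptions.

```python
import numpy as np

dt = 0.1  # assumed time step between detections

# State: [x, y, vx, vy]; measurement: [x, y] from the detector
F = np.array([[1, 0, dt, 0],
              [0, 1, 0, dt],
              [0, 0, 1,  0],
              [0, 0, 0,  1]], dtype=float)  # constant-velocity model
H = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0]], dtype=float)   # we only measure position
Q = 0.01 * np.eye(4)  # process noise (tuning assumption)
R = 0.05 * np.eye(2)  # measurement noise (tuning assumption)

x = np.zeros(4)  # initial state
P = np.eye(4)    # initial covariance

def kf_step(x, P, z):
    # Predict with the motion model
    x = F @ x
    P = F @ P @ F.T + Q
    # Update with the detector's (x, y) measurement
    y = z - H @ x
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ y
    P = (np.eye(4) - K @ H) @ P
    return x, P

# Feed positions of a target moving at 1 m/s along x:
for k in range(50):
    z = np.array([k * dt * 1.0, 0.0])
    x, P = kf_step(x, P, z)

vx, vy = x[2], x[3]  # velocity comes from the filter state, not the lidar
```

After a few updates the estimated `vx` converges toward the true 1 m/s, even though no velocity was ever measured. A UKF applies the same predict/update cycle via sigma points, which matters when the motion or measurement model is nonlinear.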