ethz-asl / rovio


ROVIO with kinect and IMU, how to synchronize #169

Closed npal1008 closed 6 years ago

npal1008 commented 6 years ago

Hi, first of all thanks for this amazing piece of code. I am trying to use ROVIO with a Kinect v2 and a Tinkerforge IMU Brick 2.0. By changing some parameters I managed to achieve relatively good and stable odometry, but I think it can be better. Since both the Kinect and the IMU send data over USB, I am having a very hard time achieving good synchronization between images and IMU messages (I think hardware synchronization is impossible here). Do you know how I can synchronize these streams of data in order to improve my odometry?

All tips are welcome and thank you in advance!!!!

ZacharyTaylor commented 6 years ago

I'm actually quite impressed you got rovio up and running over two separate USB devices. For improving timesync, I guess you will need to look a bit at your hardware and its ROS drivers.

By default most ROS sensor drivers just timestamp the data on arrival with ros::Time::now(). However, this does not account for the delay due to USB transport or any on-board processing time for the image. These delays seem to depend a lot on the sensor but are usually in the range of 10 to 100 ms.
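To make the effect concrete, here is a minimal sketch of correcting an arrival timestamp by a fixed, previously calibrated delay. The delay value and function name are hypothetical; in practice the delay would be estimated with a tool like Kalibr rather than hard-coded:

```python
# Hypothetical static delay (USB transport + on-board processing), in seconds.
# In a real setup this would come from offline calibration, not a guess.
SENSOR_DELAY_S = 0.035

def corrected_stamp(arrival_time_s: float) -> float:
    """Shift the arrival timestamp back by the estimated fixed delay,
    approximating the true capture time."""
    return arrival_time_s - SENSOR_DELAY_S
```

This only removes the constant part of the delay; any per-message jitter remains.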

There are a couple of different cases with different options for improving timesync:

1) The sensor allows two-way timesync. In this case your sensor supports sending and receiving specialized timesync messages that can be used to estimate and remove such delays, implementing something like https://en.wikipedia.org/wiki/Precision_Time_Protocol. This is the ideal case, but it is incredibly rare for a commercial sensor to provide this functionality.

2) The sensor stamps its output with its internal clock. This is a far more common case: many sensors report when the readings were made according to their own internal clock. Unfortunately, without two-way communication there is not enough information to completely sync the sensors. However, you can get it down to a fairly static offset that you can then estimate with a tool like Kalibr. @HannesSommer wrote a library for working with these time offsets, which can be found at https://github.com/ethz-asl/cuckoo_time_translator

3) The sensor accepts an external trigger. If it does, you could use a micro-controller, synced to the PC as in 1), to trigger the sensor readings and provide their timestamps.

4) No timestamps, just sensor data. Here the best you can really do is timestamp on arrival. You can maybe remove a bit of jitter if your sensor outputs at a constant rate and you feed the arrival times through a Kalman filter. Again, you can use Kalibr to find a static offset you can then remove, but you will always have some timing issues.
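Cases 2) and 4) both end up removing a roughly static offset between the sensor clock (or arrival times) and the host clock. A hedged sketch of a very crude estimator, assuming the fastest-arriving message has near-zero transport delay (Kalibr and cuckoo_time_translator do this far more robustly):

```python
# Crude static-offset estimation between a sensor's internal clock and the
# host clock. The function names are illustrative, not from any library.

def estimate_offset(device_stamps, arrival_stamps):
    """Return an offset such that device_stamp + offset ~= host time.
    Taking the minimum gap assumes the quickest message saw ~zero delay."""
    return min(a - d for d, a in zip(device_stamps, arrival_stamps))

def translate(device_stamp, offset):
    """Map a device timestamp into the host clock frame."""
    return device_stamp + offset
```

Any residual per-message delay above the minimum then shows up as one-sided jitter, which is why a filter on top of this still helps.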

npal1008 commented 6 years ago

Thank you very much for your response! With this hardware I think the only possible way is number 4. Since the output rate is more or less constant, how can I remove the jitter as you suggested?

ZacharyTaylor commented 6 years ago

Sorry for taking a while to get back to you. The basic idea would be to have a filter whose state is the timestamp, with a prediction step that advances it by the known time between camera frames, and the incoming frames' arrival timestamps taken as measurements.
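The filter described above can be sketched as a scalar Kalman filter. This is a minimal illustration, not the time_autosync implementation; the class name and variance values are made up for the example:

```python
# Minimal scalar Kalman filter over frame timestamps: the state is the current
# frame's timestamp, prediction advances it by one frame period, and each
# measured arrival stamp corrects it. Variances below are illustrative only.

class TimestampFilter:
    def __init__(self, period_s, process_var=1e-8, meas_var=1e-4):
        self.period = period_s        # known time between camera frames
        self.process_var = process_var
        self.meas_var = meas_var
        self.t = None                 # filtered timestamp estimate
        self.var = None               # estimate variance

    def update(self, measured_stamp):
        if self.t is None:
            # Initialize from the first arrival stamp.
            self.t, self.var = measured_stamp, self.meas_var
            return self.t
        # Predict: advance by one frame period.
        self.t += self.period
        self.var += self.process_var
        # Correct with the measured arrival stamp.
        gain = self.var / (self.var + self.meas_var)
        self.t += gain * (measured_stamp - self.t)
        self.var *= (1.0 - gain)
        return self.t
```

Because the process variance is much smaller than the measurement variance, the filter trusts the constant-rate model and smooths out arrival jitter, while still tracking slow drift in the frame rate.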

I have actually been getting quite a few questions about timesync in a couple of projects lately, so I decided to try writing something to address it. See https://github.com/ethz-asl/time_autosync. This repo also tries to estimate the absolute time offset online. I only put it together today, so while it at least seems to do something sensible when I run it on the EuRoC dataset, I would expect bugs if you try to use it.