ApolloAuto / apollo

An open autonomous driving platform
Apache License 2.0

The question about the time synchronization in apollo system. #14944

Open YHao29 opened 1 year ago

YHao29 commented 1 year ago

When using perception fusion in the Apollo system, a few things confuse me.

1. How is time synchronized between the camera and the lidar? What exact protocol do they use?

GPS can provide a time base for all sensors on the vehicle, but different sensors have different acquisition frequencies and different hardware connections. GPS provides PPS signals, but how does the system achieve millisecond-level timing accuracy?

2. How are RGB images and point clouds timestamped?

Most multi-sensor fusion (MSF) algorithms use timestamps to align data from different sources. Where is the relevant timestamping and alignment code in Apollo?

I would really appreciate any clues about these questions.

daohu527 commented 1 year ago

There are two ways to synchronize time:

Software synchronization

First, choose a master sensor; then, for each frame from the master sensor, find the frame from each other sensor whose timestamp is closest. Fortunately, there are not many types of sensors on an autonomous vehicle. Lidar is generally used as the master sensor, and because the camera's frame rate is usually high enough, approximate synchronization can be achieved.

|----------|----------|----------|----------|----------|      // fetch frames
o----------o----------o----------o----------o----------o     // lidar
x---x---x---x---x---x---x---x---x---x---x---x---x---x---x     // camera1
x---x---x--x----x---x--x---x----x--x---x---x----x---x--x-    // camera2

In practice the two streams can never be kept exactly aligned: even when the nominal rates are compatible (say 30 Hz and 10 Hz), the sensors free-run on independent clocks with unknown phase and drift, so you only ever get an approximate alignment. You could pick rates such as 25 Hz and 10 Hz that align periodically in theory, but timing error will still cause misalignment, and you must also ensure that the cameras' trigger times are consistent.
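The nearest-timestamp matching sketched in the diagram can be illustrated as follows. This is a minimal sketch, not Apollo's actual fusion code; the function name, the tolerance `max_diff`, and the example timestamps are all made up for illustration:

```python
import bisect

def match_nearest(master_ts, slave_ts, max_diff=0.05):
    """For each master-sensor timestamp, find the closest slave timestamp.

    Returns a list of (master, slave) pairs, with slave = None when no
    slave frame lies within max_diff seconds of the master frame.
    """
    pairs = []
    for t in master_ts:
        i = bisect.bisect_left(slave_ts, t)
        # Candidates: the slave frames immediately before and after t.
        candidates = [slave_ts[j] for j in (i - 1, i) if 0 <= j < len(slave_ts)]
        best = min(candidates, key=lambda s: abs(s - t), default=None)
        if best is not None and abs(best - t) <= max_diff:
            pairs.append((t, best))
        else:
            pairs.append((t, None))
    return pairs

# Lidar at 10 Hz as the master sensor, camera at ~30 Hz with a phase offset.
lidar = [k * 0.100 for k in range(5)]
camera = [0.005 + k * 0.0333 for k in range(15)]
for lt, ct in match_nearest(lidar, camera):
    print(f"lidar {lt:.3f}s -> camera {ct:.3f}s")
```

Because the camera runs faster than the lidar, every lidar frame finds a camera frame within a few milliseconds, which is exactly the "approximate synchronization" described above.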

Hardware synchronization

Hardware synchronization solves the above problems. The principle is to use the lidar's rotation phase to trigger the camera to take a picture, so synchronization is strictly guaranteed. However, it requires hardware support.
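As a toy illustration of phase-based triggering: given the lidar's rotation period and the time it passes azimuth zero, you can compute when it faces a chosen direction and fire the camera at those instants. All numbers here are assumptions for illustration; real setups program this into the lidar/camera drivers or a trigger board, not application code:

```python
def camera_trigger_times(lidar_period, lidar_phase0, target_azimuth_frac, n):
    """Compute n camera trigger times so exposure fires when the spinning
    lidar passes a chosen azimuth (given as a fraction of one rotation).

    lidar_period: seconds per full lidar rotation (e.g. 0.1 s at 10 Hz)
    lidar_phase0: time at which the lidar passes azimuth 0
    """
    return [lidar_phase0 + (target_azimuth_frac + k) * lidar_period
            for k in range(n)]

# Trigger the front camera when a 10 Hz lidar faces forward (half a turn
# past azimuth 0), given that the lidar passed azimuth 0 at t = 0.02 s.
print([round(t, 3) for t in camera_trigger_times(0.1, 0.02, 0.5, 3)])
# [0.07, 0.17, 0.27]
```

With this scheme the camera exposure coincides with the moment the lidar beam sweeps the camera's field of view, so the two measurements describe the scene at the same instant.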

YHao29 commented 1 year ago

Thank you very much for your reply. The schematic diagram makes it much easier to understand. But I have some detailed questions about time synchronization.

Q: The time synchronization protocol for different sensors

Actually, we are using the Apollo D-Kit for research, and we found that the lidar is connected directly to the GNSS/IMU while the cameras are connected to the IPC via USB 3.0. Does this mean the lidar and cameras synchronize their time using different protocols? If so, what exactly are they? Thanks for your reply!

daohu527 commented 1 year ago

In this case, we can only use software synchronization. If you mean the lidar and camera message formats, Apollo unifies all messages under common_msgs; sensor messages live in modules/common_msgs/sensor_msgs:

lidar https://github.com/ApolloAuto/apollo/blob/master/modules/common_msgs/sensor_msgs/pointcloud.proto

camera https://github.com/ApolloAuto/apollo/blob/master/modules/common_msgs/sensor_msgs/sensor_image.proto

Both messages carry a timestamp, which we can then use to find the closest matching sensor data.
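A minimal sketch of that lookup, using plain dicts to stand in for the pointcloud.proto / sensor_image.proto messages (the field names and the `SensorBuffer` class here are illustrative, not part of Apollo's API):

```python
class SensorBuffer:
    """Keep recent messages from one sensor and look up the one whose
    timestamp is closest to a query time."""

    def __init__(self, max_len=100):
        self.msgs = []
        self.max_len = max_len

    def push(self, msg):
        self.msgs.append(msg)
        if len(self.msgs) > self.max_len:
            self.msgs.pop(0)          # drop the oldest message

    def closest(self, timestamp):
        if not self.msgs:
            return None
        return min(self.msgs, key=lambda m: abs(m["timestamp"] - timestamp))

# Dicts stand in for protobuf messages that carry a header timestamp.
camera_buf = SensorBuffer()
for t in (0.005, 0.038, 0.072, 0.105):
    camera_buf.push({"timestamp": t, "image": b"..."})

lidar_msg = {"timestamp": 0.100, "points": []}
match = camera_buf.closest(lidar_msg["timestamp"])
print(match["timestamp"])  # 0.105
```

The fusion stage can then pair each incoming lidar message with the buffered camera message nearest to it in time.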

YHao29 commented 1 year ago

Thanks for your reply. The lidar and camera messages are indeed above the network layer. But when exactly are the RGB images and point clouds timestamped?

As far as I know, in the typical setup, point clouds are timestamped before they are transmitted to the IPC, while RGB images are timestamped after they arrive at the IPC. Is that right? And could you tell me where to find the code that timestamps images and point clouds? Thanks a lot.