orbbec / OrbbecSDK

Orbbec SDK v1&v2 Pre-Compiled Repo
https://www.orbbec3d.com/

Femto Mega, multi-device HW sync: auto timestamp correction of subordinate devices is not happening. #27

Open bumbastic opened 1 year ago

bumbastic commented 1 year ago

I noticed that the timestamps drift apart in a multi-device configuration with sync cables, one primary and 1-8 subordinates. I had to detect myself when they drift beyond the point where they can no longer be corrected to match the primary + trigger delay without risking jumping to the timestamp of the next or previous frame.

Azure Kinect automatically adjusts the timestamps of subordinates to match the timestamps of the primary, with high accuracy (< 60 microseconds).

Can I expect this to be fixed in the upcoming firmware for Femto Mega, or is this something I have to handle myself?

hzcyf commented 1 year ago

Can you describe your test methods and test data in detail?

bumbastic commented 1 year ago

The timing logic within a multi-camera capture system with hardware synchronization is designed to ensure that all cameras are capturing images at precisely the same moment. This is critical in applications such as 3D reconstruction, virtual reality, and motion analysis, where data from multiple viewpoints must be combined.

Here's a scientific description of the timing logic, abstracted from any specific code implementation:

Primary Camera Timestamping: In the system, one camera is designated as the primary camera. This camera acts as the master clock, providing the baseline timestamps for synchronization. It also emits a hardware trigger signal that prompts the subordinate cameras to capture images.
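
A minimal sketch of that role assignment, assuming a hypothetical config struct (this is not the OrbbecSDK API; `DeviceSyncConfig`, `SyncRole`, and the per-device delay step are illustrative only):

```cpp
// Sketch: one device is the primary/master clock, the rest follow its
// hardware trigger with a per-device delay so their projectors don't interfere.
#include <cstddef>
#include <cstdint>
#include <vector>

enum class SyncRole { Primary, Subordinate };

struct DeviceSyncConfig {
    SyncRole role;
    uint32_t triggerDelayUs;   // delay applied after the primary's trigger
};

std::vector<DeviceSyncConfig> makeSyncConfigs(std::size_t subordinateCount,
                                              uint32_t delayStepUs) {
    std::vector<DeviceSyncConfig> configs;
    configs.push_back({SyncRole::Primary, 0});              // master clock
    for (std::size_t i = 1; i <= subordinateCount; ++i)
        configs.push_back({SyncRole::Subordinate,
                           static_cast<uint32_t>(i) * delayStepUs});
    return configs;
}
```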

Capture Windows and Tolerance: Each camera is set to capture within a predefined time window around the trigger signal. This window accounts for predictable delays and uncertainties such as electronic signaling and processing time. A synchronization tolerance is established to determine the acceptable deviation from the precise target capture time.
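
As a sketch of the tolerance check (names and the 1 ms value are assumptions, not SDK behavior):

```cpp
// Sketch: does a frame's device timestamp fall inside the capture window
// around the expected trigger time?
#include <cstdint>
#include <cstdlib>

constexpr int64_t kSyncToleranceUs = 1000;   // acceptable deviation (assumed)

bool withinCaptureWindow(int64_t frameTimestampUs,
                         int64_t expectedTriggerUs,
                         int64_t toleranceUs = kSyncToleranceUs) {
    return std::llabs(frameTimestampUs - expectedTriggerUs) <= toleranceUs;
}
```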

Temporal Alignment and Correction: The captured images from all cameras are analyzed to find their temporal offsets relative to the primary camera's timestamp. If the offsets are within the acceptable tolerance, the frames are considered synchronized. If the offsets exceed this tolerance, temporal alignment techniques are applied. This can involve either discarding the unsynchronized frames or adjusting their timestamps by a calculated offset derived from the system's understanding of each camera's delay characteristics.
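
A sketch of that alignment step, assuming a known per-device trigger delay (the "snap within 4× tolerance" heuristic is purely illustrative):

```cpp
// Sketch: given the primary frame's timestamp and a subordinate frame's
// timestamp plus its known trigger delay, accept the frame, rewrite its
// timestamp, or drop it.
#include <cstdint>
#include <cstdlib>
#include <optional>

// Returns the corrected subordinate timestamp, or nullopt if the frame is
// too far off to be attributed to this trigger.
std::optional<int64_t> alignSubordinate(int64_t primaryTsUs,
                                        int64_t subordinateTsUs,
                                        int64_t triggerDelayUs,
                                        int64_t toleranceUs) {
    const int64_t expected = primaryTsUs + triggerDelayUs;
    const int64_t offset   = subordinateTsUs - expected;
    if (std::llabs(offset) <= toleranceUs)
        return subordinateTsUs;            // already in sync, keep as-is
    if (std::llabs(offset) < toleranceUs * 4)
        return expected;                   // small drift: snap to expected time
    return std::nullopt;                   // unsynchronized: discard the frame
}
```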

Clock Drift Compensation: Over time, the internal clocks of the cameras may drift apart. This is counteracted by regular recalibration of the timestamps. The system may periodically check the current time on each camera against the master and make adjustments to minimize the drift.
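
One way to estimate that drift, as a sketch (a plain least-squares fit over timestamp pairs; not an SDK function):

```cpp
// Sketch: sample (primary, subordinate) timestamp pairs over time and fit a
// linear drift rate, which can then be used to predict and remove the
// subordinate clock's drift between recalibrations.
#include <cstdint>
#include <vector>

struct TimestampPair {
    int64_t primaryUs;
    int64_t subordinateUs;
};

// Least-squares slope of (subordinate - primary) offset vs. primary time,
// in microseconds of drift per microsecond of elapsed time.
double estimateDriftRate(const std::vector<TimestampPair>& samples) {
    if (samples.size() < 2) return 0.0;
    double meanT = 0.0, meanD = 0.0;
    for (const auto& s : samples) {
        meanT += static_cast<double>(s.primaryUs);
        meanD += static_cast<double>(s.subordinateUs - s.primaryUs);
    }
    meanT /= samples.size();
    meanD /= samples.size();
    double num = 0.0, den = 0.0;
    for (const auto& s : samples) {
        const double t = static_cast<double>(s.primaryUs) - meanT;
        const double d = static_cast<double>(s.subordinateUs - s.primaryUs) - meanD;
        num += t * d;
        den += t * t;
    }
    return den > 0.0 ? num / den : 0.0;
}
```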

Frame Rate Considerations: At a typical frame rate of 30 FPS, each frame represents an interval of approximately 33.33 milliseconds. The system's tolerance for timestamp misalignment must be significantly less than this interval to ensure that each frame from the multiple cameras can be accurately correlated in time.
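
The arithmetic, spelled out (the 5 ms tolerance is the one I use, see below):

```cpp
// At 30 FPS the frame interval is ~33.3 ms, so a matching tolerance of a few
// milliseconds is well under half a frame and cannot confuse neighbouring frames.
#include <cstdio>

int main() {
    const double fps = 30.0;
    const double frameIntervalMs = 1000.0 / fps;   // ≈ 33.33 ms
    const double toleranceMs = 5.0;                // matching tolerance
    std::printf("frame interval: %.2f ms, tolerance: %.1f ms (%.0f%% of interval)\n",
                frameIntervalMs, toleranceMs, 100.0 * toleranceMs / frameIntervalMs);
}
```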

Hardware Trigger Precision: The system relies on the precision of the hardware trigger, which is typically much finer than the capture interval. This trigger usually has a precision on the order of microseconds, ensuring that all cameras receive the capture signal almost instantaneously relative to their capture rate.

Timestamp Synchronization Events: If significant discrepancies are detected, a synchronization event may be triggered to correct the system-wide timing. This can involve resynchronizing the clocks of all cameras to the primary camera's clock.

Continuous Monitoring and Adjustment: The system continually monitors the timestamps of the frames being captured. When a camera's timestamp falls outside the defined synchronization tolerance, it triggers an adjustment protocol to bring that camera's timing back in line with the system standard.
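
A sketch of the monitoring and resynchronization steps just described; `resyncDevice()` stands in for whatever mechanism the application uses (re-reading device clocks, re-basing timestamps, etc.) and is hypothetical:

```cpp
// Sketch: track each subordinate's running offset from the primary and
// trigger a synchronization event once it exceeds the tolerance.
#include <cstdint>
#include <cstdio>
#include <cstdlib>
#include <map>

struct MonitorState {
    int64_t toleranceUs = 3000;            // e.g. 3 ms, as I use in practice below
    std::map<int, int64_t> lastOffsetUs;   // per-subordinate offset
};

void resyncDevice(int deviceIndex, int64_t offsetUs) {
    // Stand-in for the application's actual resynchronization mechanism.
    std::printf("resync device %d, offset %lld us\n", deviceIndex,
                static_cast<long long>(offsetUs));
}

void onFramePair(MonitorState& state, int deviceIndex,
                 int64_t primaryTsUs, int64_t subordinateTsUs,
                 int64_t triggerDelayUs) {
    const int64_t offset = subordinateTsUs - (primaryTsUs + triggerDelayUs);
    state.lastOffsetUs[deviceIndex] = offset;
    if (std::llabs(offset) > state.toleranceUs)
        resyncDevice(deviceIndex, offset);  // synchronization event
}
```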

By maintaining strict temporal discipline across the cameras, the system ensures that the final data set is temporally coherent, with all frames aligning to create a unified multi-perspective view of the captured scene. This temporal coherence is foundational for accurate motion tracking, 3D modeling, and any application where data from multiple cameras must be combined into a single, synchronized representation of an event or environment.

In my case I use the SDK to sync the timestamps whenever I detect a subordinate camera offset by more than 3 ms from the primary camera. I buffer the frames with timestamps as keys and look up the most recent frames with similar timestamps (±5 ms); they are all from the same hardware trigger. I then offset all subordinate timestamps to match the primary + depth trigger delay. As long as the timestamp synchronization keeps the accuracy below 5 ms this works fine, and that seems to be the case in my tests. I connect over Ethernet through a PoE switch.
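
Roughly, the matching step looks like this (a sketch of my approach, not anything in the SDK; `Frame` and the helper names are placeholders):

```cpp
// Sketch: buffer frames per device keyed by timestamp, group the most recent
// frames that lie within ±5 ms of the primary frame (same hardware trigger),
// and rewrite the subordinate timestamps to primary + depth trigger delay.
#include <cstdint>
#include <cstdlib>
#include <map>
#include <vector>

struct Frame {
    int     deviceIndex;      // 0 = primary
    int64_t timestampUs;
    // ... image payload omitted
};

constexpr int64_t kMatchWindowUs = 5000;   // ±5 ms

// One buffer per device, keyed by timestamp so lookups near a target are cheap.
using FrameBuffer = std::map<int64_t, Frame>;

// For every subordinate, find the newest frame within ±5 ms of the primary
// frame and rebase its timestamp onto the primary's clock.
std::vector<Frame> matchAndCorrect(const Frame& primaryFrame,
                                   std::vector<FrameBuffer>& subordinateBuffers,
                                   int64_t depthTriggerDelayUs) {
    std::vector<Frame> matched{primaryFrame};
    for (auto& buffer : subordinateBuffers) {
        // Last frame at or below primary + window, then check the window.
        auto it = buffer.upper_bound(primaryFrame.timestampUs + kMatchWindowUs);
        if (it == buffer.begin()) continue;             // nothing early enough
        --it;
        if (std::llabs(it->second.timestampUs - primaryFrame.timestampUs) > kMatchWindowUs)
            continue;                                    // no frame from this trigger
        Frame corrected = it->second;
        corrected.timestampUs = primaryFrame.timestampUs + depthTriggerDelayUs;
        matched.push_back(corrected);
    }
    return matched;
}
```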