IntelRealSense / librealsense

Intel® RealSense™ SDK
https://www.intelrealsense.com/
Apache License 2.0

Inter cam sync mode option not working #10516

Closed v-lopez closed 2 years ago

v-lopez commented 2 years ago

Required Info
Camera Model D455
Firmware Version 05.13.00.50
Operating System & Version Linux
Kernel Version (Linux Only) 5.13.0-40-generic
Platform PC
SDK Version v2.50.0
Language C++
Segment Robot

Issue Description

We have two depth cameras with overlapping fields of view, and we'd like to use inter cam sync mode between them. We also need to use the "Emitter On Off" feature to capture half of the infrared images without the IR pattern. For that we need both cameras synchronized; otherwise, one camera sees the pattern of the other.

We have followed https://www.intel.com/content/dam/support/us/en/documents/emerging-technologies/intel-realsense-technology/RealSense_Multiple_Camera_WhitePaper.pdf and changed the proper settings (one camera as master, the other as slave), but each camera still sees the pattern of the other. It seems both cameras are triggering frames independently.

Since we were having these issues, we tried a simpler approach using only command-line tools: we put both cameras in Slave mode, with the understanding that no images would be captured, since there was no master camera or external device sending a trigger pulse.

With all streams off, we used rs-sensor-control to put BOTH cameras in Slave mode, then ran rs-distance on one of the cameras, and we could see valid depth measurements, meaning that Slave mode is being ignored, as nothing should be triggering the captures.

Here's the full list of rs-sensor-control options.

Device information: 
  Name                 : Intel RealSense D455
  Serial Number        : 141322252685
  Firmware Version     : 05.13.00.50
  Recommended Firmware Version : 05.13.00.50
  Physical Port        : /sys/devices/pci0000:00/0000:00:14.0/usb4/4-4/4-4:1.0/video4linux/video24
  Debug Op Code        : 15
  Advanced Mode        : YES
  Product Id           : 0B5C
  Camera Locked        : YES
  Usb Type Descriptor  : 3.2
  Product Line         : D400
  Asic Serial Number   : 137623064168
  Firmware Update Id   : 137623064168
  Ip Address           : N/A

======================================================

Device consists of 3 sensors:

  0 : Stereo Module
  1 : RGB Camera
  2 : Motion Module

Select a sensor by index: 0

======================================================

What would you like to do with the sensor?

0 : Control sensor's options
1 : Control sensor's streams
2 : Show stream intrinsics
3 : Display extrinsics

Select an action: 0

======================================================

Sensor supports the following options:

  0: Backlight Compensation is not supported
  1: Brightness is not supported
  2: Contrast is not supported
  3: Exposure
       Description   : Depth Exposure (usec)
       Current Value : 33000
  4: Gain
       Description   : UVC image gain
       Current Value : 16
  5: Gamma is not supported
  6: Hue is not supported
  7: Saturation is not supported
  8: Sharpness is not supported
  9: White Balance is not supported
  10: Enable Auto Exposure
       Description   : Enable Auto Exposure
       Current Value : 1
  11: Enable Auto White Balance is not supported
  12: Visual Preset
       Description   : Advanced-Mode Preset
       Current Value : 0
  13: Laser Power
       Description   : Manual laser power in mw. applicable only when laser power mode is set to Manual
       Current Value : 360
  14: Accuracy is not supported
  15: Motion Range is not supported
  16: Filter Option is not supported
  17: Confidence Threshold is not supported
  18: Emitter Enabled
       Description   : Emitter select, 0-disable all emitters, 1-enable laser, 2-enable laser auto (opt), 3-enable LED (opt)
       Current Value : 1
  19: Frames Queue Size
       Description   : Max number of frames you can hold at a given time. Increasing this number will reduce frame drops but increase latency, and vice versa
       Current Value : 16
  20: Total Frame Drops is not supported
  21: Auto Exposure Mode is not supported
  22: Power Line Frequency is not supported
  23: Asic Temperature is not supported
  24: Error Polling Enabled
       Description   : Enable / disable polling of camera internal errors
       Current Value : 1
  25: Projector Temperature is not supported
  26: Output Trigger Enabled
       Description   : Generate trigger from the camera to external device once per frame
       Current Value : 0
  27: Motion Module Temperature is not supported
  28: Depth Units
       Description   : Number of meters represented by a single depth unit
       Current Value : 0.001
  29: Enable Motion Correction is not supported
  30: Auto Exposure Priority is not supported
  31: Color Scheme is not supported
  32: Histogram Equalization Enabled is not supported
  33: Min Distance is not supported
  34: Max Distance is not supported
  35: Texture Source is not supported
  36: Filter Magnitude is not supported
  37: Filter Smooth Alpha is not supported
  38: Filter Smooth Delta is not supported
  39: Holes Fill is not supported
  40: Stereo Baseline
       Description   : Distance in mm between the stereo imagers
       Current Value : 94.9598
  41: Auto Exposure Converge Step is not supported
  42: Inter Cam Sync Mode
       Description   : Inter-camera synchronization mode: 0:Default, 1:Master, 2:Slave, 3:Full Salve, 4-258:Genlock with burst count of 1-255 frames for each trigger, 259 and 260 for two frames per trigger with laser ON-OFF and OFF-ON.
       Current Value : 2
  43: Stream Filter is not supported
  44: Stream Format Filter is not supported
  45: Stream Index Filter is not supported
  46: Emitter On Off
       Description   : Alternating emitter pattern, toggled on/off on per-frame basis
       Current Value : 1
  47: Zero Order Point X is not supported
  48: Zero Order Point Y is not supported
  49: LDD temperature is not supported
  50: Mc Temperature is not supported
  51: Ma Temperature is not supported
  52: Hardware Preset is not supported
  53: Global Time Enabled
       Description   : Enable/Disable global timestamp
       Current Value : 1
  54: Apd Temperature is not supported
  55: Enable Mapping is not supported
  56: Enable Relocalization is not supported
  57: Enable Pose Jumping is not supported
  58: Enable Dynamic Calibration is not supported
  59: Depth Offset is not supported
  60: Led Power is not supported
  61: Zero Order Enabled is not supported
  62: Enable Map Preservation is not supported
  63: Freefall Detection Enabled is not supported
  64: Receiver Gain is not supported
  65: Post Processing Sharpening is not supported
  66: Pre Processing Sharpening is not supported
  67: Noise Filtering is not supported
  68: Invalidation Bypass is not supported
  69: Digital Gain is not supported
  70: Sensor Mode is not supported
  71: Emitter Always On
       Description   : Emitter always on mode: 0:disabled(default), 1:enabled
       Current Value : 0
  72: Thermal Compensation
       Description   : Toggle thermal compensation adjustments mechanism
       Current Value : 1
  73: Trigger Camera Accuracy Health is not supported
  74: Reset Camera Accuracy Health is not supported
  75: Host Performance is not supported
  76: Hdr Enabled
       Description   : HDR Option
       Current Value : 0
  77: Sequence Name
       Description   : HDR Option
       Current Value : 0
  78: Sequence Size
       Description   : HDR Option
       Current Value : 2
  79: Sequence Id
       Description   : HDR Option
       Current Value : 0
  80: Humidity Temperature is not supported
  81: Enable Max Usable Range is not supported
  82: Alternate IR is not supported
  83: Noise Estimation is not supported
  84: Enable IR Reflectivity is not supported
  85: Auto Exposure Limit
       Description   : Exposure limit is in microseconds. If the requested exposure limit is greater than frame time, it will be set to frame time at runtime. Setting will not take effect until next streaming session.
       Current Value : 200000
  86: Auto Gain Limit
       Description   : Gain limits ranges from 16 to 248. If the requested gain limit is less than 16, it will be set to 16. If the requested gain limit is greater than 248, it will be set to 248. Setting will not take effect until next streaming session.
       Current Value : 248
  87: Auto Rx Sensitivity is not supported
  88: Transmitter Frequency is not supported
  89: Vertical Binning is not supported
  90: Receiver Sensitivity is not supported
  91: Auto Exposure Limit Toggle
       Description   : Toggle Auto-Exposure Limit
       Current Value : 0
  92: Auto Gain Limit Toggle
       Description   : Toggle Auto-Gain Limit
       Current Value : 0
MartyG-RealSense commented 2 years ago

Hi @v-lopez Having two cameras in Slave mode (Inter Cam Sync Mode '2') without a trigger would not serve a useful purpose. In mode 2, slave cameras listen for a trigger on each frame; if they do not detect one within a certain time period, they stop listening and perform an independent, unsynced capture on that frame. So it is not very different from having no hardware sync set at all.

Hardware sync aims to have the slave cameras follow the timestamp timing of a master camera, so using hardware sync is unlikely to solve the problem of one camera seeing the infrared dot pattern projection of another camera.

If the cameras are positioned closely together then you could try disabling the projector of one of the slave cameras. Although it is preferable to have all cameras in a multiple camera scene projecting dots onto the scene (as the greater the total number of dots in a scene, the better the depth analysis can be), in this particular case it may be sufficient for two cameras observing a similar field of view to utilize one projection so that both cameras have a better chance of being synced in terms of whether the pattern is visible or invisible on a particular frame.

v-lopez commented 2 years ago

Thanks for the clarification, we had a misunderstanding about that.

Unfortunately, their fields of view only overlap by about 20%, so we need both IR patterns.

What about "Inter Cam Sync Mode" values 259 and 260? I see they were added a few firmware versions ago, and they appear in rs-sensor-control:

  42: Inter Cam Sync Mode
       Description   : Inter-camera synchronization mode: 0:Default, 1:Master, 2:Slave, 3:Full Salve, 4-258:Genlock with burst count of 1-255 frames for each trigger, 259 and 260 for two frames per trigger with laser ON-OFF and OFF-ON.
       Current Value : 1

Could we use either 259 or 260, with an external trigger, to synchronize the depth frame captures, and therefore the laser ON-OFF periods?

MartyG-RealSense commented 2 years ago

There are external IR dot pattern projectors with higher power output and greater range that can be used instead of the in-built projector, so you could position one projector in-between two cameras if purchasing such a projector is an option for your project.

Alternatively, RealSense 400 Series cameras can use the ambient light in a scene to analyze objects / surfaces for depth information instead of using the IR dot pattern. So increasing the overall strength of illumination in the scene could provide a way to not have to have the pattern enabled at all.

Inter Cam Sync Modes greater than 2 are the genlock type of hardware sync. There are some differences in how genlock-based hardware sync operates compared to modes 1 and 2. For example, slave cameras will wait indefinitely for a trigger and not take a capture if a trigger signal is not received from a master camera or an external signal generator. Genlock sync is described in a different multiple camera white-paper document at the link below.

https://dev.intelrealsense.com/docs/external-synchronization-of-intel-realsense-depth-cameras

A check of the SDK's code shows that references to modes 259 and 260 do exist. The SDK code describes the difference between 259 and 260 as being that with 259, the Laser ON frame is sent first and then the Laser OFF frame. With 260 it is the opposite: the Laser OFF frame is sent first and then the Laser ON frame.

https://github.com/IntelRealSense/librealsense/blob/master/src/ds5/ds5-options.cpp#L478-L481

v-lopez commented 2 years ago

Thanks, we'll give 259 and 260 a try and report back.

Unfortunately, our application needs to avoid the extra weight, space, and power consumption of an external emitter, and we may operate in pitch-dark environments.

As always, thanks for the swift and great help!

MartyG-RealSense commented 2 years ago

Hi @v-lopez Do you have an update about this case that you can provide, please? Thanks!

v-lopez commented 2 years ago

We're still preparing the board to trigger the pulses for the Genlock. We'll update as soon as we have tested it.


MartyG-RealSense commented 2 years ago

Thanks very much. Good luck!

MartyG-RealSense commented 2 years ago

Hi @v-lopez Do you have an update about this case that you can provide, please? Thanks!

v-lopez commented 2 years ago

We don't have the board or the cabling yet, but I used one camera as the pulse emitter; the pulse was received by another camera in sync mode 259 or 260.

With that basic test, both modes seemed to do what they advertised.

The remaining test is to have 2 cameras with overlapping fields of view, and trigger both of them externally to see if the IR emitters are synchronized.

Hopefully next week.

MartyG-RealSense commented 2 years ago

Thanks very much for the update!

MartyG-RealSense commented 2 years ago

Hi @v-lopez Do you require further assistance with this case, please? Thanks!

v-lopez commented 2 years ago

Yes I do, this is from my colleague:

I have read the article titled 'External Synchronization of Intel® RealSense™ Depth cameras'; there, the sync signal trigger voltage is stated to be 1.8V.

Is this voltage the commonly named Input High Voltage (V_IH), or is it the electrical rating of the digital input port? What is the maximum voltage that can be applied to this pin without damaging it?

For context: we want to synchronize several cameras, and the level-shifting method we initially devised was a voltage divider. However, we are concerned about the failure mode where one of the cameras is disconnected, changing the divider ratio and causing a higher voltage at the cameras that remain connected.

MartyG-RealSense commented 2 years ago

1.8V was likely selected as the supported trigger voltage because it is a common voltage setting that external signal generator hardware can be set to.

In regard to maximum supported voltage that can be applied to the pin, I have not seen cases where the voltage applied to a pin has exceeded 1.8V. RealSense users have used voltage shifters / level shifters to increase the 1.8V voltage travelling out to external devices such as LED flashes to 5V though - for example, at https://github.com/IntelRealSense/librealsense/issues/4574#issuecomment-549352828

MartyG-RealSense commented 2 years ago

Hi @v-lopez Do you have an update about this case that you can provide, please? Thanks!

MartyG-RealSense commented 2 years ago

Hi @v-lopez Do you require further assistance with this case, please? Thanks!

v-lopez commented 2 years ago

Hi @MartyG-RealSense, We are actually facing some issues.

As a summary our goal is to:

  1. Have a system composed of 5 cameras
  2. The cameras and their laser emitters are synchronized (image capture happens at the same time)
  3. The camera laser emitters need to be toggled every pair of images (we need an image with laser for depth computation, and an image without laser for stereo tracking).
  4. We need to be able to identify which images were captured with and without the laser emitter, so we can filter them.
  5. The timestamps of the images must be coherent (images captured at the same moment need to have the same timestamp).

Current status:

  1. We achieved this a few months ago as discussed here using V4L: https://github.com/IntelRealSense/realsense-ros/issues/2242
  2. We have finished building the camera synchronization setup and are able to get synchronized images. We have only tested with 2 cameras so far, but for now we assume it will work with all 5.
  3. We seem to be able to achieve this with sync mode 260 (emitter OFF, then ON). If we use 259 (ON then OFF) the cameras sometimes capture the emitter from the other cameras.
  4. This is a problem. I have seen no way of determining if the image was captured with or without the emitter.

I have tried a simple sequence counter, in mode 260, odd images have emitter OFF, and even images have emitter ON. But as soon as a frame is lost, the logic is inverted and I have no way of detecting the lost frame.

I have tried measuring the time difference between received images, I am running at 20Hz, and since in sync mode 260 two images are captured in sequence, I assumed the relative time between OFF-ON images would be small (<10ms), and the time between OFF-OFF images would be ~50ms (1/20Hz). But the stamp difference between OFF-ON oscillates between 1 and 40ms, so it's unusable.

I suspected that the two attempts above failed because the sequence was computed over the images I actually receive, and the timestamp is set by the driver since I am using V4L. So I recompiled with RSUSB so I can read the image metadata, and tried to use the timestamps (RS2_FRAME_METADATA_SENSOR_TIMESTAMP) and sequence numbers (RS2_FRAME_METADATA_FRAME_COUNTER) there to solve the issues above. But I see the same problem.

Finally, with RSUSB, the metadata contains a RS2_FRAME_METADATA_FRAME_LED_POWER field. In sync mode 2 (Slave), with the emitter on/off option, this metadata oscillates between 0 and the configured LED POWER. But on sync mode 260 it is always set to the configured LED_POWER, even though the emitter is oscillating.

And regarding point 5, the timestamps of separate cameras triggered simultaneously don't match. Even though I can see that the images are captured at the same time, their timestamps differ by up to 23ms.

So my questions right now are:

  1. How can I find out, in a deterministic way, whether an image has been captured with the laser emitter on?
  2. How can I get synchronized timestamps from independent cameras?
  3. Can this work with V4L? I could not get 5 cameras working with RSUSB.

v-lopez commented 2 years ago

Small correction: it seems that RS2_FRAME_METADATA_SENSOR_TIMESTAMP is actually quite constant.

At 50Hz in mode 260, I get a 20ms time difference between OFF-ON frames, and 37ms between ON-OFF frames. So I can probably set thresholds based on this, but it still binds me to RSUSB, which I am not sure will handle 5 cameras.

MartyG-RealSense commented 2 years ago

You could check the emitter status on each frame with the C++ instruction GET_OPTION(RS2_OPTION_EMITTER_ENABLED) - 0 = disabled and 1 = enabled.

If the image that you are capturing is being saved to a file with a unique filename string then you could conceivably add the word ON or OFF to the filename for that particular saved frame depending on whether the result of the get_option check was 0 (off) or 1 (on).

If you are not saving images to file then you could instead print a list of frame numbers in the form of a string with the frame number and ON or OFF added beside the number depending on the outcome of the emitter check on that frame.

Multiple cameras do work best with V4L. For a comparison of the advantages and disadvantages of these two installation methods (V4L and LIBUVC / RSUSB), you can visit the comment https://github.com/IntelRealSense/librealsense/issues/5212#issuecomment-552184604 and scroll down to the section headed What are the advantages and disadvantages of using libuvc vs patched kernel modules?

v-lopez commented 2 years ago

GET_OPTION(RS2_OPTION_EMITTER_ENABLED) always returns enabled; it does not reflect what sync mode 260 does (alternating the emitter off and on).

Once I can determine which is which, I can work around what to do with each image, but right now figuring out which image has the emitter on is the blocking point.

v-lopez commented 2 years ago

@MartyG-RealSense What I do not understand is why the frame number obtained with f.get_frame_number() and in theory published by the device does not correspond to odd = emitter ON, even = emitter OFF for sync mode 260.

Is this a bug? I assumed that even if some images are not captured due to USB communication errors, I might skip a frame number in the sequence, but since the camera always triggers images in pairs of OFF/ON, the parity should not change. Yet it does change every few seconds in my experiments.

MartyG-RealSense commented 2 years ago

It sounds similar to a RealSense user at https://github.com/IntelRealSense/librealsense/issues/9528#issuecomment-917048855 who was also using sync mode 260 and could not tell whether the emitter was on or off.

They also mentioned the problem of frames getting "stuck in the pipeline somewhere" and described their eventual workaround solution at https://github.com/IntelRealSense/librealsense/issues/9528#issuecomment-923279735

v-lopez commented 2 years ago

Ignore this post: the camera that was acting as master was damaged and was emitting no pulse. That's why mode 260 was failing.

Unfortunately, that doesn't solve our issue.

Setting one camera to mode 1, and the others to 260, leads to many errors like:

[ERROR] [1657281357.604906868]: Exception: set_xu(...). xioctl(UVCIOC_CTRL_QUERY) failed Last Error: No such device
 08/07 11:55:57,604 ERROR [140576561297152] (librealsense-exception.h:52) get_xu(...). xioctl(UVCIOC_CTRL_QUERY) failed Last Error: No such device
 08/07 11:55:57,604 ERROR [140576561297152] (ds5-thermal-monitor.cpp:94) Error during thermal compensation handling: get_xu(...). xioctl(UVCIOC_CTRL_QUERY) failed Last Error: No such device
 08/07 11:55:57,604 ERROR [140576016033536] (librealsense-exception.h:52) get_xu(...). xioctl(UVCIOC_CTRL_QUERY) failed Last Error: No such file or directory
 08/07 11:55:57,604 ERROR [140576016033536] (error-handling.cpp:94) Error during polling error handler: get_xu(...). xioctl(UVCIOC_CTRL_QUERY) failed Last Error: No such file or directory
 08/07 11:55:57,604 ERROR [140576552904448] (librealsense-exception.h:52) set_xu(...). xioctl(UVCIOC_CTRL_QUERY) failed Last Error: No such device
 08/07 11:55:57,604 ERROR [140575940499200] (librealsense-exception.h:52) set_xu(...). xioctl(UVCIOC_CTRL_QUERY) failed Last Error: No such device
 08/07 11:55:57,604 ERROR [140575940499200] (global_timestamp_reader.cpp:239) Error during time_diff_keeper polling: set_xu(...). xioctl(UVCIOC_CTRL_QUERY) failed Last Error: No such device
 08/07 11:55:57,704 ERROR [140575940499200] (librealsense-exception.h:52) acquire: Cannot open '/dev/video6 Last Error: No such file or directory
 08/07 11:55:57,704 ERROR [140575940499200] (global_timestamp_reader.cpp:239) Error during time_diff_keeper polling: acquire: Cannot open '/dev/video6 Last Error: No such file or directory
 08/07 11:55:57,805 ERROR [140575940499200] (librealsense-exception.h:52) acquire: Cannot open '/dev/video6 Last Error: No such file or directory
 08/07 11:55:57,805 ERROR [140575940499200] (global_timestamp_reader.cpp:239) Error during time_diff_keeper polling: acquire: Cannot open '/dev/video6 Last Error: No such file or directory
 08/07 11:55:57,905 ERROR [140575940499200] (librealsense-exception.h:52) acquire: Cannot open '/dev/video6 Last Error: No such file or directory
 08/07 11:55:57,905 ERROR [140575940499200] (global_timestamp_reader.cpp:239) Error during time_diff_keeper polling: acquire: Cannot open '/dev/video6 Last Error: No such file or directory
[ERROR] [1657281357.928359977]: The device has been disconnected!
 08/07 11:55:57,926 WARNING [140576960911104] (ds5-factory.cpp:1152) DS5 group_devices is empty.
 08/07 11:55:57,928 WARNING [140576935732992] (ds5-factory.cpp:1152) DS5 group_devices is empty.
 08/07 11:55:57,928 WARNING [140576927340288] (ds5-factory.cpp:1152) DS5 group_devices is empty.
 08/07 11:55:58,005 ERROR [140575940499200] (librealsense-exception.h:52) acquire: Cannot open '/dev/video6 Last Error: No such file or directory
 08/07 11:55:58,005 ERROR [140575940499200] (global_timestamp_reader.cpp:239) Error during time_diff_keeper polling: acquire: Cannot open '/dev/video6 Last Error: No such file or directory
 08/07 11:55:58,105 ERROR [140575940499200] (librealsense-exception.h:52) acquire: Cannot open '/dev/video6 Last Error: No such file or directory
 08/07 11:55:58,105 ERROR [140575940499200] (global_timestamp_reader.cpp:239) Error during time_diff_keeper polling: acquire: Cannot open '/dev/video6 Last Error: No such file or directory
 08/07 11:55:58,108 WARNING [140576960911104] (ds5-factory.cpp:1152) DS5 group_devices is empty.
 08/07 11:55:58,111 WARNING [140576935732992] (ds5-factory.cpp:1152) DS5 group_devices is 

We can switch to 1 camera in mode 1, and 4 cameras in mode 2. But the emitters are not synchronized either. As an example, I am deciding to publish an image based on:

            emitter_on = f.get_frame_metadata(RS2_FRAME_METADATA_FRAME_EMITTER_MODE) > 0;

The left camera is the master, and the other two are slaves. I am only displaying the frames tagged with RS2_FRAME_METADATA_FRAME_EMITTER_MODE > 0, and as you can see all the cameras are displaying some pattern from another camera. The slaves are displaying the pattern from the master camera, and the master camera is displaying from the slaves.

But this is not deterministic; sometimes the slave cameras will display the pattern from the other slaves. For instance, in the screenshot below, the master camera has no interference, but the central camera is displaying the pattern from the other two cameras:

(Screenshot from 2022-07-08 15-32-00)

v-lopez commented 2 years ago

We have seen that our master camera may not have been triggering a pulse, maybe due to a faulty connector.

We'll repeat these tests on Monday after validating the individual components.

Please disregard the previous message for now.

MartyG-RealSense commented 2 years ago

Thanks very much @v-lopez for the update. I look forward to your next test results. Good luck!

v-lopez commented 2 years ago

My last tests were failing due to a defective camera not emitting pulses.

After replacing that camera, we can run 1 master camera and 3 other cameras in mode 260. With the Linux kernel patch, the frame metadata RS2_FRAME_METADATA_FRAME_EMITTER_MODE indicates whether the frame was captured with or without the emitter.

The remaining issue is the timestamps: even though all the frames are captured simultaneously, the timestamps of the images differ.

MartyG-RealSense commented 2 years ago

Are you able to achieve a closer sync between the timestamps of the 3 other cameras if Global Time is set to true? This setting is described at https://github.com/IntelRealSense/librealsense/pull/3909

v-lopez commented 2 years ago

I am running with Global Time, but I still see differences of up to 22 milliseconds between timestamps, running at 15Hz.

MartyG-RealSense commented 2 years ago

How long does it take for differences between the timestamps to become noticeable? Intel's original multiple camera white-paper document states that if hardware sync is working correctly then timestamps will very slowly drift apart over time, and if they remain constant then it is actually a sign that hardware sync is not working. To quote the paper:


If NO HW sync is enabled, the time stamps will, not surprisingly, appear to be perfectly aligned. This is because each individual ASIC is counting the exact same number of cycles between frames, and then sending them off. So according to their own time-lines they are sending frames at, say, exactly 33.333ms intervals, for a 30fps mode.

By contrast, if HW Sync is enabled, the time stamps will actually drift over time. You might expect this drift to be on the order of less than 1ms/minute. Knowing this, you can actually validate that your sync is working by looking at the drift in the difference of the time stamps. If you see NO DRIFT, then there is NO HW sync. If you see DRIFT, then the units are actually HW synced.


The quoted information can be found at the link below.

https://dev.intelrealsense.com/docs/multiple-depth-cameras-configuration#3-multi-camera-programming

v-lopez commented 2 years ago

So here are the results from a test.

I started 4 cameras in mode 260. Once they were ready I started a 5th camera with mode 1 at 30Hz.

After running for half an hour, these are metadata snapshots of the slave cameras, taken at roughly the same instant:

{
  "frame_number": 104685,
  "clock_domain": "global_time",
  "frame_timestamp": 1657894800065.5466,
  "frame_counter": 104685,
  "hw_timestamp": 3554260334,
  "sensor_timestamp": 3554256206,
  "actual_exposure": 8256,
  "gain_level": 35,
  "auto_exposure": 1,
  "time_of_arrival": 1657894800073,
  "backend_timestamp": 1657894800066,
  "actual_fps": 60,
  "frame_laser_power": 0,
  "frame_laser_power_mode": 0,
  "exposure_priority": 1,
  "exposure_roi_left": 0,
  "exposure_roi_right": 639,
  "exposure_roi_top": 0,
  "exposure_roi_bottom": 359,
  "frame_emitter_mode": 0,
  "raw_frame_size": 460800,
  "gpio_input_data": 0,
  "sequence_name": 0,
  "sequence_id": 0,
  "sequence_size": 0
}
{
  "frame_number": 104886,
  "clock_domain": "global_time",
  "frame_timestamp": 1657894800038.1367,
  "frame_counter": 104886,
  "hw_timestamp": 3551913717,
  "sensor_timestamp": 3551909589,
  "actual_exposure": 8256,
  "gain_level": 56,
  "auto_exposure": 1,
  "time_of_arrival": 1657894800040,
  "backend_timestamp": 1657894800033,
  "actual_fps": 60,
  "frame_laser_power": 0,
  "frame_laser_power_mode": 0,
  "exposure_priority": 1,
  "exposure_roi_left": 0,
  "exposure_roi_right": 639,
  "exposure_roi_top": 0,
  "exposure_roi_bottom": 359,
  "frame_emitter_mode": 0,
  "raw_frame_size": 460800,
  "gpio_input_data": 0,
  "sequence_name": 0,
  "sequence_id": 0,
  "sequence_size": 0
}
{
  "frame_number": 104879,
  "clock_domain": "global_time",
  "frame_timestamp": 1657894800065.515,
  "frame_counter": 104879,
  "hw_timestamp": 3551418669,
  "sensor_timestamp": 3551414541,
  "actual_exposure": 8256,
  "gain_level": 45,
  "auto_exposure": 1,
  "time_of_arrival": 1657894800073,
  "backend_timestamp": 1657894800066,
  "actual_fps": 60,
  "frame_laser_power": 0,
  "frame_laser_power_mode": 0,
  "exposure_priority": 1,
  "exposure_roi_left": 0,
  "exposure_roi_right": 639,
  "exposure_roi_top": 0,
  "exposure_roi_bottom": 359,
  "frame_emitter_mode": 0,
  "raw_frame_size": 460800,
  "gpio_input_data": 0,
  "sequence_name": 0,
  "sequence_id": 0,
  "sequence_size": 0
}
{
  "frame_number": 104365,
  "clock_domain": "global_time",
  "frame_timestamp": 1657894800030.8477,
  "frame_counter": 104365,
  "hw_timestamp": 3554513529,
  "sensor_timestamp": 3554509401,
  "actual_exposure": 8256,
  "gain_level": 41,
  "auto_exposure": 1,
  "time_of_arrival": 1657894800040,
  "backend_timestamp": 1657894800033,
  "actual_fps": 60,
  "frame_laser_power": 0,
  "frame_laser_power_mode": 0,
  "exposure_priority": 1,
  "exposure_roi_left": 0,
  "exposure_roi_right": 639,
  "exposure_roi_top": 0,
  "exposure_roi_bottom": 359,
  "frame_emitter_mode": 0,
  "raw_frame_size": 460800,
  "gpio_input_data": 250,
  "sequence_name": 0,
  "sequence_id": 0,
  "sequence_size": 0
}

The frame numbers are up to 521 frames apart, and the frame_timestamps are up to 35 ms apart.

I understand why the timestamps differ, and the drift seems in line with the ~1 ms/minute figure. But how can this be compensated? Our cameras may run for many hours, even days, without shutting down.

If the frame_numbers were kept synchronized, I could use frame_number to find all the images taken at the same instant and work from that, but it seems not to be reliable.
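One way to compensate for a slow, roughly linear clock drift like the ~1 ms/minute described above is to fit a linear model between paired timestamps from the two cameras and map one clock onto the other. This is only an illustrative sketch, not part of the SDK; the function names and the assumption of millisecond-unit timestamps are hypothetical.

```python
# Hypothetical sketch: estimate a linear drift model between a master and a
# slave camera clock from paired timestamp samples (same scene instant seen
# by both), then express any slave timestamp on the master clock.
# Assumes timestamps in milliseconds; an exact linear fit via least squares.

def fit_drift(master_ts, slave_ts):
    """Least-squares fit of slave_ts ~= a * master_ts + b."""
    n = len(master_ts)
    mean_m = sum(master_ts) / n
    mean_s = sum(slave_ts) / n
    cov = sum((m - mean_m) * (s - mean_s) for m, s in zip(master_ts, slave_ts))
    var = sum((m - mean_m) ** 2 for m in master_ts)
    a = cov / var          # clock rate ratio (drift slope)
    b = mean_s - a * mean_m  # constant offset between the clocks
    return a, b

def slave_to_master(ts, a, b):
    """Invert the fitted model: slave timestamp -> master clock."""
    return (ts - b) / a
```

Refitting periodically (say, every few minutes) would keep the correction valid over multi-day runs, since the slope itself can change slowly with temperature.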

MartyG-RealSense commented 2 years ago

At the link below, a RealSense team member provides advice on how to adjust the frame counter to take account of drifting.

https://community.intel.com/t5/Items-with-no-label/Question-about-Frame-Sync-Index-D435-HW-sync/m-p/600573

I have also attached a PDF copy of that discussion that can be downloaded in the browser.

Question about Frame Sync Index.pdf
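The advice in that discussion amounts to recording each slave's counter offset relative to the master once, then subtracting it from later frames so counters become directly comparable. A minimal sketch of that bookkeeping (the class and method names are hypothetical, not SDK API):

```python
# Hypothetical sketch: align per-camera frame counters by recording each
# slave's offset from the master at startup, then normalizing later values.

class CounterAligner:
    def __init__(self):
        self.offsets = {}  # camera serial number -> counter offset vs. master

    def register(self, serial, slave_counter, master_counter):
        """Call once per slave, on a frame known to match a master frame."""
        self.offsets[serial] = slave_counter - master_counter

    def aligned(self, serial, slave_counter):
        """Return the slave counter expressed in the master's numbering."""
        return slave_counter - self.offsets.get(serial, 0)
```

This only handles a constant offset; if the counters drift over time (as reported later in this thread), the offsets would need periodic re-registration.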

v-lopez commented 2 years ago

Hi @MartyG-RealSense, thanks for that link, it is very helpful.

Unfortunately, I am seeing my FRAME COUNTERS drift.

| Cam ID | Counter diff at t=0 | t=300 s | t=600 s | t=900 s | t=1800 s |
|--------|--------------------|---------|---------|---------|----------|
| Master | 0  | 0  | 0  | 0   | 0   |
| S1     | +5 | +6 | +4 | +4  | +0  |
| S2     | +3 | -6 | -9 | -14 | -76 |
| S3     | +3 | +3 | +3 | +5  | +4  |
| S4     | +3 | +1 | +4 | +4  | +3  |

I am not getting any reset due to ESD, because the drift increases slowly over time and never goes back to 0. At t=900 s the master counter is 26251, and all the others are close to it.

I am also starting all the slave cameras first, then starting the master one, so all cameras are ready to start capturing right away.

MartyG-RealSense commented 2 years ago

A RealSense team member suggests at https://github.com/IntelRealSense/realsense-ros/issues/1906#issuecomment-866645537 that if there is a large offset between the FRAME_TIMESTAMP and TIME_OF_ARRIVAL timestamps then the system may be taking time to pass data between the USB host and the RealSense SDK. I believe that this is referring to timestamps on single cameras though and not multiple hardware-synced ones.

It is also advised in https://github.com/IntelRealSense/librealsense/issues/2707#issuecomment-438771890 that the Master should always be set first, and the Slaves set and run after that.

v-lopez commented 2 years ago

I think the second comment that you mention holds true for Sync Mode = 2. But since we are using mode 260, the slave cameras wait for a pulse signal and do not trigger anything until then.

Also, the problem is that the drift increases over time. If it were constant, I could work around it as mentioned in your second-to-last comment.

MartyG-RealSense commented 2 years ago

If you are able to adjust the frame counter for individual cameras, perhaps you could periodically use a simple logic check to correct the drift.

For each slave camera's frame count: IF slave frame counter != master frame counter THEN slave frame counter = master frame counter
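The logic check above can be sketched as a small helper that computes, for each slave, the correction needed to snap its counter back to the master's. This is purely illustrative; the function name and the dict-of-counters input are hypothetical.

```python
# Hypothetical sketch of the periodic logic check described above:
# for every slave whose counter differs from the master's, return the
# signed correction (master - slave) to apply to that slave's counter.

def correct_drift(master_counter, slave_counters):
    """slave_counters: {camera_name: current_frame_counter}."""
    return {name: master_counter - c
            for name, c in slave_counters.items()
            if c != master_counter}
```

A caller would apply the returned corrections to its own per-camera offsets on each check; slaves already matching the master are omitted from the result.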

v-lopez commented 2 years ago

But the root cause that I am trying to fix is to identify which frames were captured simultaneously and make sure they have the same timestamp.

Each camera is setting a different timestamp even with Global Time, so I cannot use this information. I also cannot assume that the frame counter of the last image to arrive equals the previous master frame counter, because:

Anders_G_intel mentions here:

> If the units are HW synced the difference in frame count between different cameras should NEVER drift.

I am seeing a frame count drift on most cameras, and not due to ESD reset.

MartyG-RealSense commented 2 years ago

The Anders section quoted states "If you set frame buffers to minimum, and OS has good driver, and you don't lose frames, then everything should be fine". This raises the possibility that the frame counts are not aligned because frames are being dropped.

v-lopez commented 2 years ago

I believe that if I were losing frames, I would have gaps in the sequence (which might be happening, but is not an issue right now). However, as I understand it, the ASIC increments the frame counter regardless of whether the frame is later dropped.

So over time I should not get drifting frame counters.

I have visually compared two frames received at a similar time, with a 200-frame counter drift, and verified that they were captured at the same instant (both were facing a stopwatch with 10 ms precision and displayed the same time).
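Since the counters drift but the arrival times of truly simultaneous frames stay close, one workaround is to pair frames across cameras by arrival time instead of by counter, treating frames closer than half the frame period (~8.3 ms at 60 FPS) as simultaneous. A hedged sketch, assuming sorted lists of `(frame_counter, time_of_arrival_ms)` tuples like the metadata dumps earlier in this thread:

```python
# Hypothetical sketch: pair frames from a master and a slave camera by
# time_of_arrival rather than by the (drifting) frame counter. For each
# master frame, pick the nearest slave frame in time and accept the pair
# only if the gap is under half the frame period.

def pair_by_arrival(master_frames, slave_frames, fps=60):
    """master_frames, slave_frames: lists of (counter, arrival_ms), sorted by time."""
    half_period_ms = 1000.0 / fps / 2
    pairs, j = [], 0
    for mc, mt in master_frames:
        # advance j while the next slave frame is closer in time
        while (j + 1 < len(slave_frames)
               and abs(slave_frames[j + 1][1] - mt) < abs(slave_frames[j][1] - mt)):
            j += 1
        sc, st = slave_frames[j]
        if abs(st - mt) <= half_period_ms:
            pairs.append((mc, sc))
    return pairs
```

Note that `time_of_arrival` includes USB/host latency, so this heuristic can misfire under heavy bus load; the hardware-timestamp drift correction discussed earlier would be more robust.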

MartyG-RealSense commented 2 years ago

There are a range of factors in this case that may make achieving the desired result difficult.

Capturing alternating frames where the emitter is completely off or completely on in the saved image is difficult even without multiple camera sync.

A more straightforward approach is to capture images from all attached cameras simultaneously without sync like the C++ script at https://github.com/IntelRealSense/librealsense/issues/2219#issuecomment-412899468 does, though I understand that having images both for emitter on and emitter off is important to your requirements.

v-lopez commented 2 years ago

Thanks for your time and help @MartyG-RealSense.

I am closing this issue, as the original problem is already solved and is entirely different from what we have been discussing these last few days.

I'll look into creating such a script or writing our custom librealsense client to have better control of the timings and see if that improves the result.

Best regards.

MartyG-RealSense commented 2 years ago

Thanks very much too @v-lopez for your patience during this case. Good luck!