Closed: v-lopez closed this issue 2 years ago
Hi @v-lopez Having two cameras in Slave mode (Inter Cam Sync Mode '2') without a trigger would not serve a useful purpose. In mode 2, slave cameras listen for a trigger on each frame, and if they do not detect a trigger within a certain time period they stop listening and perform an independent, unsynced capture on that frame. So it would not be very different from having no hardware sync set at all.
Hardware sync aims to have the slave cameras follow the timestamp timing of a master camera, so using hardware sync is unlikely to solve the problem of one camera seeing the infrared dot pattern projection of another camera.
If the cameras are positioned closely together then you could try disabling the projector of one of the slave cameras. Although it is preferable to have all cameras in a multiple camera scene projecting dots onto the scene (as the greater the total number of dots in a scene, the better the depth analysis can be), in this particular case it may be sufficient for two cameras observing a similar field of view to utilize one projection so that both cameras have a better chance of being synced in terms of whether the pattern is visible or invisible on a particular frame.
Thanks for the clarification, we had a misunderstanding about that.
Unfortunately, they only overlap like 20% of the field of view, so we need both IR patterns.
What about "Inter Cam Sync Mode" values 259 and 260? I see they were added a few firmware versions ago, and I can see them in rs-sensor-control:
42: Inter Cam Sync Mode
Description : Inter-camera synchronization mode: 0:Default, 1:Master, 2:Slave, 3:Full Slave, 4-258:Genlock with burst count of 1-255 frames for each trigger, 259 and 260 for two frames per trigger with laser ON-OFF and OFF-ON.
Current Value : 1
Could we use either 259 or 260, with an external trigger, to synchronize the depth frame captures, and therefore the laser ON-OFF periods?
There are external IR dot pattern projectors with higher power output and greater range that can be used instead of the in-built projector, so you could position one projector in-between two cameras if purchasing such a projector is an option for your project.
Alternatively, RealSense 400 Series cameras can use the ambient light in a scene to analyze objects / surfaces for depth information instead of using the IR dot pattern. So increasing the overall strength of illumination in the scene could provide a way to not have to have the pattern enabled at all.
Inter Cam Sync Modes greater than 2 are the genlock type of hardware sync. There are some differences in how genlock-based hardware sync operates compared to mode 1 and 2. For example, slave cameras will wait indefinitely for a trigger and not take a capture if a trigger signal is not received from a master camera or an external signal generator. Genlock sync is described in a different multiple camera white-paper document at the link below.
https://dev.intelrealsense.com/docs/external-synchronization-of-intel-realsense-depth-cameras
A check of the SDK's code shows that references to mode 259 and 260 do exist. The SDK code describes the difference between 259 and 260 as being that with 259, the Laser ON frame is sent first and then the Laser OFF frame. With 260 it is the opposite - the Laser OFF frame is sent first and then the Laser ON frame.
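Based on that SDK description, the expected emitter state for each frame of a trigger pair could be sketched as follows. The helper below is hypothetical (not SDK API); it only encodes the ON-OFF vs OFF-ON ordering described above.

```python
def expected_emitter_on(mode: int, frame_in_pair: int) -> bool:
    """Expected emitter state for a frame within a genlock trigger pair.

    frame_in_pair: 0 = first frame of the pair, 1 = second frame.
    Per the SDK comment, mode 259 sends the Laser ON frame first,
    mode 260 the Laser OFF frame first.
    """
    if mode == 259:          # ON-OFF ordering
        return frame_in_pair == 0
    if mode == 260:          # OFF-ON ordering
        return frame_in_pair == 1
    raise ValueError("only genlock modes 259 and 260 alternate the emitter")

# Example: mode 260, first frame of a pair -> emitter OFF
print(expected_emitter_on(260, 0))  # False
```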
https://github.com/IntelRealSense/librealsense/blob/master/src/ds5/ds5-options.cpp#L478-L481
Thanks, we'll give 259 and 260 a try and report back.
Unfortunately, our application needs to avoid the extra weight, space, and power consumption of an external emitter, and we may operate in pitch-dark environments.
As always, thanks for the swift and great help!
Hi @v-lopez Do you have an update about this case that you can provide, please? Thanks!
We're still preparing the board to trigger the pulses for the Genlock. We'll update as soon as we have tested it.
Thanks very much. Good luck!
Hi @v-lopez Do you have an update about this case that you can provide, please? Thanks!
We don't have the board or the cabling yet, but I used another camera as the pulse emitter, and this pulse was received by another camera in sync mode 259 or 260.
With that basic test, both modes seemed to do what they advertised.
The remaining test is to have 2 cameras with overlapping fields of view, and trigger both of them externally to see if the IR emitters are synchronized.
Hopefully next week.
Thanks very much for the update!
Hi @v-lopez Do you require further assistance with this case, please? Thanks!
Yes I do, this is from my colleague:
I have read the article titled 'External Synchronization of Intel® RealSense™ Depth cameras'; there, the sync signal trigger voltage is stated to be 1.8V.
Is this voltage the commonly named Input High Voltage (V_IH), or is it the electrical rating of the digital input port? What is the maximum voltage that can be applied to this pin without damaging it?
The context of the issue is the fact that we want to synchronize several cameras and the initial devised level shifting method was a voltage divider, but we are concerned about the failure mode where one of the cameras is disconnected, thus changing the voltage divider and causing a higher voltage in the cameras that remain connected.
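The failure mode described above can be illustrated numerically. This is a back-of-the-envelope sketch with assumed component values (a 3.3 V source and camera inputs crudely modeled as 10 kΩ resistive loads; real camera sync inputs are high-impedance, so treat this only as an illustration of the hazard): removing cameras from a shared divider raises the voltage seen by the ones that remain.

```python
def divider_out(v_in: float, r_top: float, loads: list) -> float:
    """Voltage at the divider node with camera inputs modeled as parallel
    resistive loads (a simplification for illustration only)."""
    if not loads:
        return v_in  # no load path: the node floats up toward the source
    g = sum(1.0 / r for r in loads)   # parallel conductance of the loads
    r_bottom = 1.0 / g
    return v_in * r_bottom / (r_top + r_bottom)

# Hypothetical values: r_top sized so four 10k loads see ~1.8 V.
v, r_top = 3.3, 2083.0
print(round(divider_out(v, r_top, [10e3] * 4), 2))  # 1.8
print(round(divider_out(v, r_top, [10e3] * 3), 2))  # 2.03  (one camera unplugged)
print(round(divider_out(v, r_top, [10e3] * 2), 2))  # 2.33  (two cameras unplugged)
```

Each disconnected camera raises the node voltage on the survivors, which is exactly the concern: a resistive divider is only safe if the load is fixed.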
1.8V was likely selected as the supported trigger voltage because it is a common voltage setting that external signal generator hardware can be set to.
In regard to maximum supported voltage that can be applied to the pin, I have not seen cases where the voltage applied to a pin has exceeded 1.8V. RealSense users have used voltage shifters / level shifters to increase the 1.8V voltage travelling out to external devices such as LED flashes to 5V though - for example, at https://github.com/IntelRealSense/librealsense/issues/4574#issuecomment-549352828
Hi @v-lopez Do you have an update about this case that you can provide, please? Thanks!
Hi @v-lopez Do you require further assistance with this case, please? Thanks!
Hi @MartyG-RealSense, We are actually facing some issues.
As a summary, our goal is to:
Current status:
I have tried a simple sequence counter: in mode 260, odd images have the emitter OFF and even images have it ON. But as soon as a frame is lost, the logic is inverted, and I have no way of detecting the lost frame.
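This failure mode can be reproduced in a short sketch (hypothetical frame numbers; the parity rule is the one described above). Once a frame is dropped and a local receive index is used instead of the true device counter, every subsequent guess is inverted:

```python
def emitter_on_by_parity(frame_number: int) -> bool:
    # Parity assumption from the thread: odd frames ON, even frames OFF.
    return frame_number % 2 == 1

device_frames = [0, 1, 2, 3, 4, 5]               # true parity: OFF ON OFF ON OFF ON
received = [f for f in device_frames if f != 2]  # frame 2 lost in transport
local_index = list(range(len(received)))         # what a naive local counter sees

true_states = [emitter_on_by_parity(f) for f in received]
guessed = [emitter_on_by_parity(i) for i in local_index]
print(true_states)  # [False, True, True, False, True]
print(guessed)      # [False, True, False, True, False]
# From the dropped frame onward, every guess is the opposite of the truth.
```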
I have tried measuring the time difference between received images. I am running at 20Hz, and since in sync mode 260 two images are captured in sequence, I assumed the relative time between OFF-ON images would be small (<10ms), and the time between OFF-OFF images would be ~50ms (1/20Hz). But the stamp difference between OFF-ON oscillates between 1 and 40ms, so it's unusable.
I suspected that the two attempts above could be due to the sequence being computed on the images I actually receive, and the timestamp is set by the driver since I am using V4L. So I have recompiled with RSUSB, so I can read image metadata, and try to use the timestamps (RS2_FRAME_METADATA_SENSOR_TIMESTAMP) and sequence numbers (RS2_FRAME_METADATA_FRAME_COUNTER) there to solve the issues above. But I see the same problem.
Finally, with RSUSB, the metadata contains a RS2_FRAME_METADATA_FRAME_LED_POWER field. In sync mode 2 (Slave), with the emitter on/off option, this metadata oscillates between 0 and the configured LED POWER. But on sync mode 260 it is always set to the configured LED_POWER, even though the emitter is oscillating.
And relative to 5., the timestamps of separate cameras triggered simultaneously don't match. Even though I can see that the images are captured at the same time, their timestamps differ by up to 23ms.
So my questions right now are:
- How can I find out, in a deterministic way, whether an image has been captured with the laser emitter on?
- How can I get synchronized timestamps from independent cameras?
- Can this work with V4L? Because I could not get 5 cameras working with RSUSB.
Small rectification: it seems that RS2_FRAME_METADATA_SENSOR_TIMESTAMP is actually quite constant.
At 50Hz, I get a 20ms time difference between OFF-ON frames, and 37ms between ON-OFF frames in mode 260. So I guess I can put some thresholds based on this, but this still binds me to RSUSB, which I am not sure will handle 5 cameras.
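The threshold idea could be sketched as follows, using the ~20 ms and ~37 ms figures above. The helper, the 28 ms midpoint threshold, and the seeding of the first frame as OFF (mode 260 sends OFF first) are all assumptions, not SDK behavior:

```python
def classify_emitter(deltas_ms, threshold_ms=28.0, first_on=False):
    """Label each frame's emitter state from inter-frame sensor-timestamp gaps.

    Assumption (mode 260 at 50 Hz, figures from the thread): the OFF->ON gap
    inside a trigger pair is ~20 ms and the ON->OFF gap between pairs ~37 ms,
    so a midpoint threshold separates them. deltas_ms[i] is the gap preceding
    frame i+1; the first frame's state must be seeded.
    """
    states = [first_on]
    for d in deltas_ms:
        # A short gap means this frame is the second of a pair, i.e. ON.
        states.append(d < threshold_ms)
    return states

# Example delta stream at 50 Hz in mode 260: OFF, ON, OFF, ON, OFF
print(classify_emitter([20.1, 37.2, 19.8, 36.9]))
# [False, True, False, True, False]
```

Note this inherits the same fragility as the parity counter: a dropped frame merges two gaps into one long one, so it should be combined with a plausibility check on the gap length.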
You could check the emitter status on each frame with the C++ call get_option(RS2_OPTION_EMITTER_ENABLED) - 0 = disabled and 1 = enabled.
If the image that you are capturing is being saved to a file with a unique filename string then you could conceivably add the word ON or OFF to the filename for that particular saved frame depending on whether the result of the get_option check was 0 (off) or 1 (on).
If you are not saving images to file then you could instead print a list of frame numbers in the form of a string with the frame number and ON or OFF added beside the number depending on the outcome of the emitter check on that frame.
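The filename-tagging idea above could be sketched like this (the filename format and helper name are hypothetical; the emitter flag would come from whichever per-frame check ends up being reliable):

```python
def tagged_filename(frame_number: int, emitter_on: bool, ext: str = "png") -> str:
    """Build a unique filename carrying the emitter state for that frame."""
    tag = "ON" if emitter_on else "OFF"
    return f"frame_{frame_number:06d}_{tag}.{ext}"

print(tagged_filename(42, True))   # frame_000042_ON.png
print(tagged_filename(43, False))  # frame_000043_OFF.png
```

The same formatting works for the print-a-list variant: emit `f"{frame_number} {tag}"` lines instead of filenames.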
Multiple cameras do work best with V4L. For a comparison of the advantages and disadvantages of these two installation methods (V4L and LIBUVC / RSUSB), you can visit the comment https://github.com/IntelRealSense/librealsense/issues/5212#issuecomment-552184604 and scroll down to the section headed What are the advantages and disadvantages of using libuvc vs patched kernel modules?
get_option(RS2_OPTION_EMITTER_ENABLED) always returns enabled; it does not reflect what sync mode 260 does (alternating the emitter off and on).
Once I can determine which is which, I can workaround what to do with each image, but right now figuring out which image has emitter is the blocking point.
@MartyG-RealSense What I do not understand is why the frame number obtained with f.get_frame_number(), which in theory is published by the device, does not correspond to odd = emitter ON, even = emitter OFF for sync mode 260.
Is this a bug? I assumed that even if some images are not captured due to USB communication errors, I might skip a frame number in a sequence, but since the camera always triggers images in pairs of OFF/ON, this should not change. And it does change every few seconds in my experiments.
It sounds similar to a RealSense user at https://github.com/IntelRealSense/librealsense/issues/9528#issuecomment-917048855 who was also using sync mode 260 and could not tell whether the emitter was on or off.
They also mentioned the problem of frames getting "stuck in the pipeline somewhere" and described their eventual workaround solution at https://github.com/IntelRealSense/librealsense/issues/9528#issuecomment-923279735
Ignore this post; the camera that was acting as master was damaged and was emitting no pulse. That's why mode 260 was failing.
Unfortunately, that doesn't solve our issue.
Setting one camera to mode 1, and the others to 260, leads to many errors like:
[ERROR] [1657281357.604906868]: Exception: set_xu(...). xioctl(UVCIOC_CTRL_QUERY) failed Last Error: No such device
08/07 11:55:57,604 ERROR [140576561297152] (librealsense-exception.h:52) get_xu(...). xioctl(UVCIOC_CTRL_QUERY) failed Last Error: No such device
08/07 11:55:57,604 ERROR [140576561297152] (ds5-thermal-monitor.cpp:94) Error during thermal compensation handling: get_xu(...). xioctl(UVCIOC_CTRL_QUERY) failed Last Error: No such device
08/07 11:55:57,604 ERROR [140576016033536] (librealsense-exception.h:52) get_xu(...). xioctl(UVCIOC_CTRL_QUERY) failed Last Error: No such file or directory
08/07 11:55:57,604 ERROR [140576016033536] (error-handling.cpp:94) Error during polling error handler: get_xu(...). xioctl(UVCIOC_CTRL_QUERY) failed Last Error: No such file or directory
08/07 11:55:57,604 ERROR [140576552904448] (librealsense-exception.h:52) set_xu(...). xioctl(UVCIOC_CTRL_QUERY) failed Last Error: No such device
08/07 11:55:57,604 ERROR [140575940499200] (librealsense-exception.h:52) set_xu(...). xioctl(UVCIOC_CTRL_QUERY) failed Last Error: No such device
08/07 11:55:57,604 ERROR [140575940499200] (global_timestamp_reader.cpp:239) Error during time_diff_keeper polling: set_xu(...). xioctl(UVCIOC_CTRL_QUERY) failed Last Error: No such device
08/07 11:55:57,704 ERROR [140575940499200] (librealsense-exception.h:52) acquire: Cannot open '/dev/video6 Last Error: No such file or directory
08/07 11:55:57,704 ERROR [140575940499200] (global_timestamp_reader.cpp:239) Error during time_diff_keeper polling: acquire: Cannot open '/dev/video6 Last Error: No such file or directory
08/07 11:55:57,805 ERROR [140575940499200] (librealsense-exception.h:52) acquire: Cannot open '/dev/video6 Last Error: No such file or directory
08/07 11:55:57,805 ERROR [140575940499200] (global_timestamp_reader.cpp:239) Error during time_diff_keeper polling: acquire: Cannot open '/dev/video6 Last Error: No such file or directory
08/07 11:55:57,905 ERROR [140575940499200] (librealsense-exception.h:52) acquire: Cannot open '/dev/video6 Last Error: No such file or directory
08/07 11:55:57,905 ERROR [140575940499200] (global_timestamp_reader.cpp:239) Error during time_diff_keeper polling: acquire: Cannot open '/dev/video6 Last Error: No such file or directory
[ERROR] [1657281357.928359977]: The device has been disconnected!
08/07 11:55:57,926 WARNING [140576960911104] (ds5-factory.cpp:1152) DS5 group_devices is empty.
08/07 11:55:57,928 WARNING [140576935732992] (ds5-factory.cpp:1152) DS5 group_devices is empty.
08/07 11:55:57,928 WARNING [140576927340288] (ds5-factory.cpp:1152) DS5 group_devices is empty.
08/07 11:55:58,005 ERROR [140575940499200] (librealsense-exception.h:52) acquire: Cannot open '/dev/video6 Last Error: No such file or directory
08/07 11:55:58,005 ERROR [140575940499200] (global_timestamp_reader.cpp:239) Error during time_diff_keeper polling: acquire: Cannot open '/dev/video6 Last Error: No such file or directory
08/07 11:55:58,105 ERROR [140575940499200] (librealsense-exception.h:52) acquire: Cannot open '/dev/video6 Last Error: No such file or directory
08/07 11:55:58,105 ERROR [140575940499200] (global_timestamp_reader.cpp:239) Error during time_diff_keeper polling: acquire: Cannot open '/dev/video6 Last Error: No such file or directory
08/07 11:55:58,108 WARNING [140576960911104] (ds5-factory.cpp:1152) DS5 group_devices is empty.
08/07 11:55:58,111 WARNING [140576935732992] (ds5-factory.cpp:1152) DS5 group_devices is
We can switch to 1 camera in mode 1, and 4 cameras in mode 2. But the emitters are not synchronized either.
As an example, I am deciding whether to publish an image based on:
emitter_on = f.get_frame_metadata(RS2_FRAME_METADATA_FRAME_EMITTER_MODE) > 0;
The left camera is the master, and the other two are slaves.
I am only displaying the frames tagged with RS2_FRAME_METADATA_FRAME_EMITTER_MODE > 0, and as you can see all the cameras are displaying some pattern from another camera. The slaves are displaying the pattern from the master camera, and the master camera is displaying the pattern from the slaves.
But this is not deterministic; sometimes the slave cameras will display the pattern from the other slaves. For instance in the screenshot below, the master camera has no interference, but the central camera is displaying the pattern from the other two cameras:
We have seen that our master camera may have not been triggering a pulse, maybe due to a faulty connector.
We'll repeat these tests on Monday after validating the individual components.
Please disregard the previous message for now.
Thanks very much @v-lopez for the update. I look forward to your next test results. Good luck!
My last tests were failing due to a defective camera not emitting pulses.
Replacing that camera, we can now run 1 master camera and 3 other cameras in mode 260.
With the Linux kernel patch, the frame metadata RS2_FRAME_METADATA_FRAME_EMITTER_MODE indicates whether the frame was captured with or without the emitter.
The remaining issue is the timestamps: even though all the frames are captured simultaneously, the timestamps of the images differ.
Are you able to achieve a closer sync between the timestamps of the 3 other cameras if Global Time is set to true? This is described at https://github.com/IntelRealSense/librealsense/pull/3909
I am running with Global Time, but I still see timestamp differences of up to 22 milliseconds, running at 15Hz.
How long does it take for differences between the timestamps to become noticeable? Intel's original multiple camera white-paper document states that if hardware sync is working correctly then timestamps will very slowly drift apart over time, and that if they remain constant it is actually a sign that hardware sync is not working. To quote the paper:
If NO HW sync is enabled, the time stamps will now surprisingly appear to be perfectly aligned. This is because each individual ASIC is counting the exact same number of cycles between frames, and then sending them off. So according to their own time-lines they are sending frames at say, exactly 33.333ms intervals, for a 30fps mode.
By contrast, if HW Sync is enabled, the time stamps will actually drift over time. You might expect this drift to be on the order of less than 1ms/minute. Knowing this, you can actually validate that your sync is working by looking at the drift in the difference of the time stamps. If you see NO DRIFT, then there is NO HW sync. If you see DRIFT, then the units are actually HW synced.
The quoted information can be found at the link below.
https://dev.intelrealsense.com/docs/multiple-depth-cameras-configuration#3-multi-camera-programming
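The drift check the paper describes could be sketched as follows. This is an illustrative calculation on synthetic timestamps (the function name and the 0.5 ms/min drift figure are assumptions for the example, not measurements): fit the slope of the per-frame timestamp difference; a near-zero slope suggests no hardware sync, a small steady slope suggests the units are genuinely synced.

```python
def timestamp_drift_ms_per_min(t_master_ms, t_slave_ms):
    """Least-squares slope of the per-frame timestamp difference, in ms/min.

    Inputs are matched per-frame timestamps (ms) from two cameras.
    """
    diffs = [s - m for m, s in zip(t_master_ms, t_slave_ms)]
    n = len(diffs)
    mean_x = sum(t_master_ms) / n
    mean_y = sum(diffs) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(t_master_ms, diffs))
    den = sum((x - mean_x) ** 2 for x in t_master_ms)
    slope = num / den        # ms of drift per ms of elapsed time
    return slope * 60_000    # convert to ms per minute

# Synthetic example: slave drifts 0.5 ms/min relative to master at ~30 fps.
master = [i * 33.333 for i in range(10_000)]
slave = [t + 3.0 + (0.5 / 60_000) * t for t in master]
print(round(timestamp_drift_ms_per_min(master, slave), 3))  # 0.5
```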
So here are the results from a test.
I started 4 cameras in mode 260. Once they were ready I started a 5th camera with mode 1 at 30Hz.
After running for half an hour this is one metadata snapshot of all the slave cameras taken roughly at the same instant:
{
"frame_number": 104685,
"clock_domain": "global_time",
"frame_timestamp": 1657894800065.5466,
"frame_counter": 104685,
"hw_timestamp": 3554260334,
"sensor_timestamp": 3554256206,
"actual_exposure": 8256,
"gain_level": 35,
"auto_exposure": 1,
"time_of_arrival": 1657894800073,
"backend_timestamp": 1657894800066,
"actual_fps": 60,
"frame_laser_power": 0,
"frame_laser_power_mode": 0,
"exposure_priority": 1,
"exposure_roi_left": 0,
"exposure_roi_right": 639,
"exposure_roi_top": 0,
"exposure_roi_bottom": 359,
"frame_emitter_mode": 0,
"raw_frame_size": 460800,
"gpio_input_data": 0,
"sequence_name": 0,
"sequence_id": 0,
"sequence_size": 0
}
{
"frame_number": 104886,
"clock_domain": "global_time",
"frame_timestamp": 1657894800038.1367,
"frame_counter": 104886,
"hw_timestamp": 3551913717,
"sensor_timestamp": 3551909589,
"actual_exposure": 8256,
"gain_level": 56,
"auto_exposure": 1,
"time_of_arrival": 1657894800040,
"backend_timestamp": 1657894800033,
"actual_fps": 60,
"frame_laser_power": 0,
"frame_laser_power_mode": 0,
"exposure_priority": 1,
"exposure_roi_left": 0,
"exposure_roi_right": 639,
"exposure_roi_top": 0,
"exposure_roi_bottom": 359,
"frame_emitter_mode": 0,
"raw_frame_size": 460800,
"gpio_input_data": 0,
"sequence_name": 0,
"sequence_id": 0,
"sequence_size": 0
}
{
"frame_number": 104879,
"clock_domain": "global_time",
"frame_timestamp": 1657894800065.515,
"frame_counter": 104879,
"hw_timestamp": 3551418669,
"sensor_timestamp": 3551414541,
"actual_exposure": 8256,
"gain_level": 45,
"auto_exposure": 1,
"time_of_arrival": 1657894800073,
"backend_timestamp": 1657894800066,
"actual_fps": 60,
"frame_laser_power": 0,
"frame_laser_power_mode": 0,
"exposure_priority": 1,
"exposure_roi_left": 0,
"exposure_roi_right": 639,
"exposure_roi_top": 0,
"exposure_roi_bottom": 359,
"frame_emitter_mode": 0,
"raw_frame_size": 460800,
"gpio_input_data": 0,
"sequence_name": 0,
"sequence_id": 0,
"sequence_size": 0
}
{
"frame_number": 104365,
"clock_domain": "global_time",
"frame_timestamp": 1657894800030.8477,
"frame_counter": 104365,
"hw_timestamp": 3554513529,
"sensor_timestamp": 3554509401,
"actual_exposure": 8256,
"gain_level": 41,
"auto_exposure": 1,
"time_of_arrival": 1657894800040,
"backend_timestamp": 1657894800033,
"actual_fps": 60,
"frame_laser_power": 0,
"frame_laser_power_mode": 0,
"exposure_priority": 1,
"exposure_roi_left": 0,
"exposure_roi_right": 639,
"exposure_roi_top": 0,
"exposure_roi_bottom": 359,
"frame_emitter_mode": 0,
"raw_frame_size": 460800,
"gpio_input_data": 250,
"sequence_name": 0,
"sequence_id": 0,
"sequence_size": 0
}
The frame numbers are up to 521 frames apart, and the frame_timestamps up to 35ms apart.
I understand the reason why the timestamps differ, and this seems to be in line with the ~1ms/minute drift. But how can this be compensated? Our cameras may be running for many hours, even days, without shutting down.
If the frame_numbers were kept synchronized, I could find all the images taken at the same instant using the frame_number and work with that, but it seems not to be reliable.
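The grouping idea could be sketched like this (hypothetical data layout: a flat list of (camera_id, frame_counter) records rather than real SDK frames). It only works while the counters stay aligned across cameras, which is exactly what the drift breaks:

```python
def group_by_counter(frames):
    """Group frames captured 'at the same instant' by device frame counter.

    frames: iterable of (camera_id, frame_counter).
    Returns only counters seen on every camera (complete groups).
    """
    groups = {}
    for cam, counter in frames:
        groups.setdefault(counter, {})[cam] = counter
    cams = {cam for cam, _ in frames}
    return {c: g for c, g in groups.items() if set(g) == cams}

frames = [("master", 100), ("s1", 100), ("s2", 100),
          ("master", 101), ("s1", 101), ("s2", 102)]  # s2 drifted by one
print(sorted(group_by_counter(frames)))  # [100] - the drifted frame breaks group 101
```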
At the link below, a RealSense team member provides advice on how to adjust the frame counter to take account of drifting.
I have also attached a PDF copy of that discussion that can be downloaded in the browser.
Hi @MartyG-RealSense, thanks for that link, it is very helpful.
Unfortunately, I am seeing my FRAME COUNTERS drift.
Cam ID | Counter diff at t=0 | Counter diff at t=300 | Counter diff at t=600 | Counter diff at t=900 | Counter diff at t=1800 |
---|---|---|---|---|---|
Master | 0 | 0 | 0 | 0 | 0 |
S1 | +5 | +6 | +4 | +4 | +0 |
S2 | +3 | -6 | -9 | -14 | -76 |
S3 | +3 | +3 | +3 | +5 | +4 |
S4 | +3 | +1 | +4 | +4 | +3 |
I am not getting any reset due to ESD, because the drift slowly increases over time, and it never goes back to 0. At t=900s the master counter is 26251, and all the others are around it.
I am also starting all the slave cameras first, then starting the master one, so all cameras are ready to start capturing right away.
A RealSense team member suggests at https://github.com/IntelRealSense/realsense-ros/issues/1906#issuecomment-866645537 that if there is a large offset between the FRAME_TIMESTAMP and TIME_OF_ARRIVAL timestamps then the system may be taking time to pass data between the USB host and the RealSense SDK. I believe that this is referring to timestamps on single cameras though and not multiple hardware-synced ones.
It is also advised in https://github.com/IntelRealSense/librealsense/issues/2707#issuecomment-438771890 that the Master should always be set first, and the Slaves set and run after that.
I think the second comment you mention holds true for Sync Mode = 2. But since we're using 260, the slave cameras wait until there's a pulse signal and don't trigger anything until then.
Also, the problem is that the drift increases over time. If it were a constant offset, I could work around it as mentioned in your second-to-last comment.
If you are able to adjust the frame counter for individual cameras, perhaps you could periodically use a simple logic check to correct the drift.
For each slave camera's frame count: IF slave frame counter != master frame counter THEN slave frame counter = master frame counter.
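That correction could be sketched as a periodic offset resync (all names and numbers hypothetical, not SDK API): at each checkpoint, record a per-slave offset against the master, then remap subsequent slave counters onto the master's numbering.

```python
def resync_counters(master_counter, slave_counters):
    """Compute a per-slave offset so that raw_slave_counter + offset
    maps onto the master's counter at this checkpoint.

    slave_counters: dict of camera_id -> current frame counter.
    """
    return {cam: master_counter - c for cam, c in slave_counters.items()}

# Illustrative checkpoint: master at 26251, slaves drifted by varying amounts.
offsets = resync_counters(26251, {"s1": 26251, "s2": 26175, "s3": 26255})
print(offsets)  # {'s1': 0, 's2': 76, 's3': -4}
# Later: corrected = raw_slave_counter + offsets[cam]
```

This papers over the drift rather than explaining it, and it assumes the checkpoint snapshot itself is taken on frames that truly belong together.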
But the underlying problem I am trying to fix is identifying which frames were captured simultaneously and making sure they have the same timestamp.
Each camera is setting a different timestamp even with Global Time, so I cannot use this information. I cannot assume that the frame counter of the last image to arrive equals the previous master frame counter, because:
Anders_G_intel mentions here:
If the units are HW synced the difference in frame count between different cameras should NEVER drift.
I am seeing a frame count drift on most cameras, and not due to ESD reset.
The Anders section quoted states "If you set frame buffers to minimum, and OS has good driver, and you don't lose frames, then everything should be fine". This raises the possibility that the frame counts are not aligned because frames are being dropped.
I believe that if I were losing frames, I would have gaps in the sequence (which might be happening, but is not an issue right now); but as I understood it, the ASIC increments the frame counter regardless of whether the frame is later dropped.
So over time I should not get drifting frame counters.
I have visually compared 2 frames received at a similar time, with a 200-frame counter drift, and I have verified that they were captured at the same instant (they both were facing a stopwatch with 10ms precision, and displayed the same instant).
There are a range of factors in this case that may make achieving the desired result difficult.
- Using genlock, a sync system that is experimental and considered "non-validated" by Intel, compared to the original mature and validated non-genlock sync system (Inter Cam Sync Modes 1 and 2).
- Using mode 260 to alternate the emitter on and off, a mode that has no available references except for https://github.com/IntelRealSense/librealsense/issues/9528#issuecomment-917048855
- Capturing alternating frames where the emitter is completely off or completely on in the saved image is difficult even without multiple camera sync.
A more straightforward approach is to capture images from all attached cameras simultaneously without sync like the C++ script at https://github.com/IntelRealSense/librealsense/issues/2219#issuecomment-412899468 does, though I understand that having images both for emitter on and emitter off is important to your requirements.
Thanks for your time and help @MartyG-RealSense.
I am closing this issue, as the original problem is already solved and is totally different from what we have been discussing these last few days.
I'll look into creating such a script or writing our custom librealsense client to have better control of the timings and see if that improves the result.
Best regards.
Thanks very much too @v-lopez for your patience during this case. Good luck!
Issue Description
We have two depth cameras with overlapping fields of view, and we'd like to use inter cam sync mode between them. We also need to use the "Emitter On Off" feature to capture half of the infrared images without the IR pattern. For that we need both cameras synchronized, otherwise, one camera is seeing the pattern of the other.
We have followed https://www.intel.com/content/dam/support/us/en/documents/emerging-technologies/intel-realsense-technology/RealSense_Multiple_Camera_WhitePaper.pdf and changed the proper settings (one camera as master, the other as slave) but each camera is seeing the pattern of the other. It seems both cameras are triggering frames independently.
Since we were having these issues, we tried a simpler approach using only command-line tools: putting both cameras in Slave mode, with the understanding that no images would be captured, since nothing was triggering them, as there's no master camera or any device sending a pulse to trigger captures.
With all streams off, using rs-sensor-control we put BOTH cameras into Slave mode; then we ran rs-distance on one of the cameras and could see valid depth measurements, meaning that the Slave mode is being ignored, as nothing should be triggering it. Here's the full list of rs-sensor-control options.