Closed Raulval82 closed 2 years ago
As the External Synchronization paper was formatted as a web page rather than a PDF, I had the idea of using the Wayback Machine website, which archives past versions of internet pages, to find an archived copy of the paper, and I saved it as a PDF file for you. You can download it from the link below.
External Synchronization of Intel® RealSense™ Depth cameras.pdf
Issue Description
Hi,
I am trying to use an encoder signal to trigger frame acquisition at exactly equal distances from each other, so I can properly reconstruct a large moving object. The encoder signal has been adapted to 1.8 V at roughly 2 Hz. As the trigger signal I am using a 1 ms pulse (I cannot generate a 100 µs pulse right now), and I am powering the trigger circuit from the same camera's 3.3 V pin through some resistors.
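For reference, this is roughly how I set the sync mode from Python. This is only a minimal sketch with pyrealsense2; the 4-258 genlock value range is my reading of the whitepaper, so treat it as an assumption:

```python
# Minimal sketch: put a RealSense depth sensor into genlock slave mode
# with pyrealsense2. Hardware-dependent calls are guarded so the pure
# helper can be exercised without a camera attached.
try:
    import pyrealsense2 as rs
except ImportError:
    rs = None  # library not installed; configuration part is a sketch only

# Assumption: genlock slave modes occupy sync-mode values 4-258,
# per my reading of the External Synchronization whitepaper.
GENLOCK_MIN, GENLOCK_MAX = 4, 258


def is_genlock_mode(mode: int) -> bool:
    """Pure helper: does this inter_cam_sync_mode value select genlock?"""
    return GENLOCK_MIN <= mode <= GENLOCK_MAX


def configure_genlock(sync_mode: int = 4):
    """Set the sync mode before streaming, then start a pipeline."""
    if not is_genlock_mode(sync_mode):
        raise ValueError(f"{sync_mode} is not a genlock sync mode")
    ctx = rs.context()
    dev = ctx.query_devices()[0]
    depth_sensor = dev.first_depth_sensor()
    # Set the option before streaming starts, then open the streams.
    depth_sensor.set_option(rs.option.inter_cam_sync_mode, sync_mode)
    pipeline = rs.pipeline()
    cfg = rs.config()
    cfg.enable_stream(rs.stream.depth, 848, 480, rs.format.z16, 30)
    cfg.enable_stream(rs.stream.color, 848, 480, rs.format.rgb8, 30)
    pipeline.start(cfg)
    return pipeline


if __name__ == "__main__" and rs is not None:
    pipe = configure_genlock(4)  # camera now waits for external triggers
```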
First, in the RealSense Viewer I can see the RGB image moving between depth frames: the depth frames apparently arrive on time, but between them I can see the RGB image moving over the frozen point cloud. Is this how it should work? Isn't the camera supposed to capture an RGB and a depth frame only when the trigger arrives?
Another strange behavior is an apparent delay: the first triggers are not visible immediately. If the trigger signals keep arriving, frames start to appear, but not the current ones; it is as if the output were two or three triggers behind.
The last issue I am facing is that I receive two frames for each pulse when using the capture_frame function from the Open3D library in Python (genlock is configured with a value of 4), and in the second frame the RGB and depth are not properly aligned. Is there a way of validating the frame rate in the Viewer? I want to check whether the issue is on the camera side or in the Open3D library.
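To take the Viewer and Open3D out of the loop, I was thinking of logging per-frame timestamps (e.g. `frame.get_timestamp()` in pyrealsense2, which returns milliseconds) and analysing them offline. A minimal sketch of that analysis; the 100 ms gap threshold is my own assumption, based on the roughly 500 ms pulse spacing at 2 Hz:

```python
def frames_per_pulse(timestamps_ms, gap_ms=100.0):
    """Group frame timestamps (ms) into trigger bursts.

    Frames separated by less than gap_ms are assumed to belong to the
    same external trigger pulse (pulses arrive ~500 ms apart at 2 Hz).
    Returns the number of frames seen per pulse; with genlock working as
    I expect, every entry should be the configured burst size.
    """
    if not timestamps_ms:
        return []
    groups = [1]
    for prev, cur in zip(timestamps_ms, timestamps_ms[1:]):
        if cur - prev < gap_ms:
            groups[-1] += 1  # same burst as the previous frame
        else:
            groups.append(1)  # a new trigger pulse starts here
    return groups


def effective_fps(timestamps_ms):
    """Average delivered frame rate over the whole capture."""
    if len(timestamps_ms) < 2:
        return 0.0
    span_s = (timestamps_ms[-1] - timestamps_ms[0]) / 1000.0
    return (len(timestamps_ms) - 1) / span_s
```

For example, timestamps like `[0, 33, 500, 533, 1000, 1033]` would give `frames_per_pulse(...) == [2, 2, 2]`, i.e. two frames per pulse, which is exactly the behavior I am seeing.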
Thank you very much in advance.