iamsavva opened 3 years ago
camera.zip
When I launch the rs_d400_and_t265.launch file, I get an error about the serial number, even though I have already configured the serial numbers of both cameras in rs_d400_and_t265.launch and in both cameras' yaml files. You can find all my configuration in the attachment. How did you get it working?
Within ROS, rs_d400_and_t265.launch doesn't require specifying serial numbers; on my end it just runs as long as I plug in both cameras and run `roslaunch realsense2_camera rs_d400_and_t265.launch`. Notice that here I am talking about ROS, not ROS2; you seemed to be using ROS2 in your post.
@iamsavva, yes, I am talking about ROS2.
UPD: I empirically found that the pointcloud timestamp is delayed by about 350ms. That is: given a pointcloud message from the D435 with timestamp T1, and a T265 pose stamped with T1 as well, the pointcloud in fact better matches the T265 pose produced at timestamp T2 = T1 - 350ms. Bearing in mind that the T265 pose estimates visually appear to be live, I am inclined to say that the D435's timestamping is delayed.
@MartyG-RealSense @doronhi, any suggestions or advice?
Hi, this issue is still present, both with D455 and D435 cameras.
The problem seems to be with latency; these posts and issues discuss D435 camera latency (one, two, three).
@doronhi, in this issue you said that the timestamps between the D435 and the host device should be in sync (by the way, what is the "Global Timestamp domain" you reference there?).
The timestamps are not in sync between different devices for me. I run the T265 and D435, grab the timestamp provided in /d435/depth/color/points/header, find a T265-supplied tf2 with that same timestamp, and place the pointcloud into the map using that tf2. The map is garbage because the timestamps are not in sync; I have to delay the D435 timestamp by 320ms to get visually nice results.
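The delayed lookup just described, taking the pose at (cloud_stamp - delay) instead of at cloud_stamp, can be sketched as follows. This is a plain-Python stand-in for the tf2 buffer lookup; the names and the 0.32 s constant are just illustrative:

```python
# Illustrative sketch of the workaround: when placing a D435 pointcloud,
# look up the T265 pose at (cloud_stamp - delay) instead of at
# cloud_stamp itself.

import bisect

D435_DELAY = 0.32  # seconds; the empirically found offset, tune per setup

def pose_for_cloud(pose_buffer, cloud_stamp, delay=D435_DELAY):
    """Return the buffered pose closest in time to cloud_stamp - delay.

    pose_buffer is a time-sorted list of (stamp, pose) pairs, standing
    in for the tf2 buffer."""
    target = cloud_stamp - delay
    stamps = [s for s, _ in pose_buffer]
    i = bisect.bisect_left(stamps, target)
    # pick the nearer of the two neighbouring samples
    candidates = pose_buffer[max(i - 1, 0):i + 1]
    return min(candidates, key=lambda sp: abs(sp[0] - target))[1]
```

A proper fix would of course be correct stamps at the source; this only papers over a constant offset.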
My questions are:

1. Would it be better to launch the cameras separately via rs_camera.launch and rs_t265.launch (with appropriate transforms connecting the two trees)?
2. rostopic delay gives me only 50 ms, which, as I describe above, is inaccurate. Would using pure librealsense rather than the ROS wrapper be a better idea to reduce the latency?

@MartyG-RealSense @doronhi, I would really appreciate some help with this; I've had this issue for a few months now. Cheers.
Hi @iamsavva, frames in librealsense2 have a timestamp domain. Setting the "global time enabled" flag sets the timestamp domain to "Global Timestamp domain". This means using the hardware timestamps (the original timestamps given to the frame by the firmware) and synchronizing them with the system timestamp. Thus we avoid momentary jumps caused by the system while still using the host time as a reference.
"Global time enabled" should be set by default (you can look it up in the rqt_reconfigure tool). The realsense2_camera node does not keep a queue of frames; a delay in publishing a frame can be caused by processing time, but that should not alter the timestamp of the frame.
Notice that the "global timestamp domain" is only available when frame metadata is available, so make sure you don't see the following message in the realsense2_camera log: "Frame metadata isn't available!"
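Conceptually, such a mechanism can be pictured as fitting a linear model from the hardware clock to the host clock and then stamping frames through that model, so stamps are host-referenced but keep the hardware clock's smoothness. The sketch below is only my illustration of the idea, not librealsense's actual code:

```python
# Conceptual sketch (an assumption, not librealsense's implementation) of
# a "global timestamp" mechanism: keep the smooth hardware clock, but fit
# a least-squares line hw -> host from occasional (hw, host) observations,
# so frames get host-referenced stamps without per-frame host-clock jitter.

class GlobalTimeModel:
    def __init__(self):
        self.pairs = []  # (hw_time, host_time) observations

    def observe(self, hw, host):
        self.pairs.append((hw, host))

    def to_host(self, hw):
        """Map a hardware timestamp to host time via a least-squares line."""
        n = len(self.pairs)
        if n < 2:
            return hw  # no model yet; fall back to the raw stamp
        sx = sum(h for h, _ in self.pairs)
        sy = sum(t for _, t in self.pairs)
        sxx = sum(h * h for h, _ in self.pairs)
        sxy = sum(h * t for h, t in self.pairs)
        slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
        intercept = (sy - slope * sx) / n
        return slope * hw + intercept
```

If a mechanism like this mis-estimates the slope or intercept (for example, from bad observation pairs), every frame of that sensor would carry a consistent offset, which is the kind of fault being hypothesized below.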
The only explanation I can think of right now is a fault in the global-timestamp synchronization mechanism. You could test this by turning the "global time enabled" flag off for all the sensors (2 copies, stereo and color, for the D435, and 1 copy for the T265). I expect a delay between the D435 pointcloud and the T265 pose when the flag is off, but if there is indeed an issue with the synchronizing mechanism, maybe it will work better for you. Please let me know.
When I find the time, I would like to create the following test application for measuring the delay between 2 devices: the 2 devices will be mounted together and moved by the user from side to side. For the D435, the app will find the image where the direction of the movement reverses, based on a surface tracker. For the T265, the app will find the frame where the direction of the movement reverses, based on the pose data. The app will then compare the time of the direction change on both devices. With that, it will be possible to compare environments and start debugging. If by any chance you have the extra time to create such an app, or any other testing application, it would help a lot.
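The core of that proposed test can be sketched in a few lines: find the reversal instant in each stream's lateral-motion track and take the difference. Everything device-specific (the surface tracker for the D435, pose extraction for the T265) is left out, and all names are illustrative:

```python
# Sketch of the proposed test: the rig is waved side to side, and we find
# the instant each stream sees the motion reverse; the difference between
# the two reversal times estimates the inter-device delay. Each track is
# a time-sorted list of (timestamp_s, lateral_position) samples.

def reversal_time(track):
    """Return the timestamp where the velocity first changes sign."""
    for (t0, x0), (t1, x1), (t2, x2) in zip(track, track[1:], track[2:]):
        if (x1 - x0) * (x2 - x1) < 0:  # sign flip: reversal at t1
            return t1
    return None

def inter_device_delay(d435_track, t265_track):
    """Delay of the D435 stream relative to the T265, in seconds."""
    r_d, r_t = reversal_time(d435_track), reversal_time(t265_track)
    if r_d is None or r_t is None:
        return None
    return r_d - r_t
```

If the D435 stamps lag as suspected, inter_device_delay should come out positive at roughly the observed 320-350ms.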
Hi, I'm running the rs_d400_and_t265.launch file with the appropriate mount (D435+T265 cameras), and I noticed in rviz that the pointcloud is consistently lagging behind; please see the video: https://youtu.be/_9eeCFFdWj8
You can see in the video that the TF responds immediately (which matches my previous experience with the T265 camera), while the pointcloud lags behind, even though it is rotated and moved properly in accordance with the appropriate tf.
This matches my experience trying to insert the pointcloud into the grid_map; I mention this because at first I thought the issue might be the hardware not being able to draw the pointcloud fast enough, but that's not the case. As a result, the resulting maps are unacceptably poor.
What are the ways around this? How can I get the true timestamp at which the photo (pointcloud) was taken? Is the problem due to processing the image into the pointcloud (filters:=pointcloud results in enable_sync:=true, so the measurements bunch up, and I can't tell whether the lag is due to pointcloud processing)? Am I missing some synchronization setting? The lag is substantial and visually noticeable, ~200ms.

For reference, I'm running on a TX2, Ubuntu 18.04, RealSense ROS v2.2.17, LibRealSense v2.38.1, D435 firmware 05.12.02.100.