Closed: JeyP4 closed this issue 4 years ago
Any suggestion?
The method you are using to measure the time delay (filming a displayed stopwatch and comparing readings) is the most effective.
The problem with what you are trying to do is that ROS does not guarantee the time delay between publishing and receiving. There are too many variables that make the latency unpredictable: network traffic, CPU usage, operating system scheduling. Furthermore, you are using a compressed topic, which adds processing time due to the compression/decompression stages.
You can measure the latency using the timestamp. Each image topic has a header containing the timestamp of when the image was acquired; you can take the current time in the subscriber callback (ros::Time::now() -> http://wiki.ros.org/roscpp/Overview/Time) and compute the difference to calculate the real delay.
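A minimal sketch of that computation, with plain Python tuples standing in for the roscpp time types (the helper names and sample stamp values below are illustrative, not from the wrapper):

```python
# Sketch of the delay computation done in an image-subscriber callback:
#   delay = ros::Time::now() - msg.header.stamp
# A ROS timestamp is a (seconds, nanoseconds) pair; we model it as a tuple.

def stamp_to_sec(stamp):
    """Convert a (sec, nsec) ROS-style timestamp to float seconds."""
    sec, nsec = stamp
    return sec + nsec * 1e-9

def delay_sec(header_stamp, now_stamp):
    """Latency between message stamping and reception, in seconds."""
    return stamp_to_sec(now_stamp) - stamp_to_sec(header_stamp)

# Example: image stamped at t = 100.000 s, callback fires at t = 100.040 s,
# giving a measured delay of about 0.040 s.
print(delay_sec((100, 0), (100, 40_000_000)))
```

Note that this only measures stamp-to-reception latency: whatever happened to the frame before the stamp was written is invisible to it.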
1) I am first experimenting by just running `roslaunch zed_wrapper zed_camera.launch`, and then in another terminal `rostopic delay /zed_node/rgb/image_rect_color/compressed`. This gives a delay value of ~0.040 sec. No other program is running on Ubuntu 16.04.
2) I also tried with the uncompressed image topic, and found that the compression/decompression processing time is not significant: it is on the order of ~0.002 sec.
3) `rostopic delay xxxx` does the same thing, per its documentation.
Anyway, I also tried creating a subscriber which prints the difference between the current time and the message timestamp, and found the result matches `rostopic delay`.
Could you please test this at your end? I mean both tests:
a) `rostopic delay`
b) the camera-facing stopwatch test
Do the two results match for you?
I suspect that the timestamp provided in the image message refers to the time when it is published on the ROS network, not to the time when the image was captured. So all the processing time spent on lens correction, depth-map computation, etc. is not accounted for (or some part of it is not).
Can you remind me of the camera parameters you are using (resolution, framerate)?
This is obtained with HD720 @ 60 FPS; the image is displayed using `rqt_image_view`.
Each test I made reported a delay of about ~65 msec, which is correct, since the known delay for the ZED camera is about 4-5 frames due to the OS operations on USB.
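As a sanity check on those numbers, 4-5 frames at 60 FPS does work out to roughly the measured ~65 msec (a trivial sketch, not from the wrapper):

```python
# Convert a latency expressed in camera frames to milliseconds.
def frames_to_ms(frames, fps):
    return frames / fps * 1000.0

# 4-5 frames at 60 FPS:
low, high = frames_to_ms(4, 60), frames_to_ms(5, 60)
print(round(low, 1), round(high, 1))  # about 66.7 and 83.3 ms
```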
I performed my test with: VGA @ 30 fps, sensing mode: true, depth mode: performance.
The results you obtained are very good; I have never seen such results. I even tried without ROS, using the SDK tools, and there the delay was also 200 ms (measured by screen recording). I am not able to conclude this issue.
What is the configuration of the system you used for your testing?
What do you mean by "sensing mode: true"? If you are using "SENSING_MODE_FILL", then you are adding latency due to the depth processing needed to "fill the holes". This mode is good for AR/VR applications, but not for robotics.
I was trying Sensing mode fill. Let me revert and try tomorrow.
Sorry for my late reply, system availability issues :disappointed: Let's come to the topic: I tried many ways to assess the time delay.
To assess this, it is better to use a universal method: the camera-facing stopwatch test.
Test 1: ZED Explorer
Among the tools provided with the ZED SDK, I used ZED Explorer at VGA 30 fps and performed the camera-facing stopwatch test. The time delays observed in the recorded video are 100-150 msec. My screen refreshes at 60 fps and I recorded the desktop at 60 fps.
Test 2: `roslaunch zed_wrapper zed_camera.launch`
zed.yaml: default configuration with VGA @ 30 fps and sensing mode: standard.
I performed the camera-facing stopwatch test and also measured the delay with `rostopic delay /zed_node/rgb/image_rect_color`, which reports a constant 32 msec.
I would expect ZED Explorer to show a smaller delay, because it doesn't carry any ROS layer. Moreover, when I perform the camera-facing stopwatch test on the ROS images, e.g. on /zed_node/rgb/image_rect_color, the time delays are 100-150 msec.
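One way to reconcile the two numbers is that `rostopic delay` only measures the stamp-to-subscriber leg, while the stopwatch test measures the whole pipeline. A rough decomposition, with illustrative (not measured) figures for the unstamped stages:

```python
# End-to-end latency seen by the stopwatch test is the sum of several stages;
# `rostopic delay` only covers the stamp -> subscriber stage.
# All figures below are illustrative assumptions, not measurements.
stages_ms = {
    "exposure + USB transfer + SDK processing (before stamping)": 80.0,
    "stamp -> subscriber (what `rostopic delay` measures)": 32.0,
    "rendering + screen refresh (display side)": 20.0,
}
total = sum(stages_ms.values())
print(total)  # 132.0 -> inside the 100-150 ms range seen on the recording
```

If the first stage is not covered by the message timestamp, the ~32 msec figure and the ~100-150 msec figure are not in contradiction; they simply measure different spans.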
Do I have to modify any source file in the wrapper to obtain the actual timestamp? By the way, I am using SDK 2.8.
Thanks for your consideration :wink:
The results I reported above were obtained with SDK v2.8 and the latest version of the ZED wrapper available in the master branch, without any further modification.
I connect the camera directly to the USB port of the laptop, without using any USB3 hub.
Thanks for your kind cooperation. I am also connecting the ZED directly to my PC's USB3 port. I think I have to live with this problem.
As one last try: I see in the SDK 2.8 release notes something related to getTimestamp(TIME_REFERENCE_IMAGE).
Do you think any modification in the source files might yield the correct actual timestamp?
Hi @JeyP4, sorry for the late reply, I missed your last question.
Here you can find the documentation for the "getTimestamp" function: https://www.stereolabs.com/docs/api/classsl_1_1Camera.html#aeed4f708d46d12dd6b3c175ad076ba93
As you can read, you have two options: TIME_REFERENCE_IMAGE and TIME_REFERENCE_CURRENT. There is a third option, "ros::Time::now()", which is similar to using "TIME_REFERENCE_CURRENT", but I'm not really sure they match precisely.
We chose "TIME_REFERENCE_IMAGE" because it's the "real" time when the image was received by the system; you can use the other two solutions if they better match your needs.
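To illustrate how the three stamping choices change the delay a subscriber measures, here is a small simulation (all the numeric clock values are made up; only their relative ordering matters):

```python
# Simulated clock readings for one frame, in seconds (illustrative only):
t_image_received = 100.000   # ~ getTimestamp(TIME_REFERENCE_IMAGE)
t_grab_returns   = 100.080   # ~ getTimestamp(TIME_REFERENCE_CURRENT)
t_publish        = 100.085   # ~ ros::Time::now() at publish time
t_callback       = 100.117   # subscriber callback fires

# The delay the subscriber computes depends entirely on which stamp was used:
delay_image   = t_callback - t_image_received  # full pipeline latency
delay_current = t_callback - t_grab_returns    # hides SDK processing time
delay_now     = t_callback - t_publish         # ~ what `rostopic delay` shows
print(round(delay_image, 3), round(delay_current, 3), round(delay_now, 3))
```

With TIME_REFERENCE_IMAGE the subscriber-side difference covers the whole pipeline; with the other two stamps, whatever happened before stamping is simply not visible in the measurement.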
Hi @JeyP4 can this issue be closed?
Closed for inactivity
Hi, I am trying to measure the time delay between when an image is captured and when it is received by a ROS subscriber. I also waited for the new SDK after first raising this question here.
But still, `rostopic delay /zed_node/rgb/image_rect_color/compressed` gives a delay value on the order of ~0.040 sec. However, when I measure the actual time delay by starting a stopwatch, pointing the ZED at it, and recording the screen, I observe a delay of ~0.100-0.250 sec. This is highly variable and also very large. The captured frame is displayed using RViz. I am working on a delayed-control algorithm and want to measure the image-transmission delay for effective control; right now I can't figure out how to measure it. I am attaching a screen-recording video to explain the issue better.
https://youtu.be/5jBjnbvuyX4
Nvidia GeForce GTX 1050 Ti
i7, 6 cores
128 GB RAM
Screen @ 60 fps