Eric-nguyen1402 opened this issue 1 month ago
Hi @Eric-nguyen1402 ,
I can't really tell what the units are on your graph. How much delay are you experiencing between frames? Are you experiencing the same thing with the ros_kortex_vision package?
The purpose of the graph is to verify that the timestamps of the messages are close enough (in nanoseconds) to be synchronized. Here I am trying to adjust the slop value in the ApproximateTimeSynchronizer
to check whether the time difference is within the acceptable threshold (the slop value). For instance, when I set the slop to 0.1 seconds, it worked twice but then stopped functioning correctly.
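Roughly, the setup I am testing looks like the sketch below. This is a simplified version, not my full code, assuming rclpy and the message_filters Python API; the node and callback names are just placeholders, and the topic names are the ones discussed in this thread:

```python
# Simplified sketch: synchronize the color and depth topics with
# message_filters.ApproximateTimeSynchronizer using a 0.1 s slop.
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Image
from message_filters import Subscriber, ApproximateTimeSynchronizer


class RgbdSyncNode(Node):
    def __init__(self):
        super().__init__('rgbd_sync_node')  # placeholder name
        color_sub = Subscriber(self, Image, '/camera/color/image_raw')
        depth_sub = Subscriber(self, Image, '/camera/depth_registered/image_rect')
        # slop is the maximum allowed timestamp difference between messages, in seconds
        self.sync = ApproximateTimeSynchronizer([color_sub, depth_sub],
                                                queue_size=10, slop=0.1)
        self.sync.registerCallback(self.synced_callback)

    def synced_callback(self, color_msg, depth_msg):
        # Only called when both messages arrive with stamps within the slop
        dt = abs((color_msg.header.stamp.sec + color_msg.header.stamp.nanosec * 1e-9)
                 - (depth_msg.header.stamp.sec + depth_msg.header.stamp.nanosec * 1e-9))
        self.get_logger().info(f'Synchronized pair, stamp difference: {dt:.4f} s')


def main():
    rclpy.init()
    rclpy.spin(RgbdSyncNode())
    rclpy.shutdown()


if __name__ == '__main__':
    main()
```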
Unfortunately, I haven't experimented with the ros_kortex_vision package because we are using ROS 2 now.
Given that the ros2_kortex_vision package is not maintained by Kinova, there is little I can do about this. I will say that 0.1 second of delay between the two frames is within expected values for a Gen3. If you just open the rtsp stream (e.g. with VLC with rtsp://192.168.1.10/leftir), you will see that the video input itself has a lot of lag.
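For example, something as simple as the rough sketch below (using OpenCV instead of VLC; the URL is the one mentioned above, adjust the IP to your robot) is enough to see the latency of the raw stream:

```python
# Rough sketch: open the robot's RTSP stream directly to eyeball the latency.
import cv2

cap = cv2.VideoCapture('rtsp://192.168.1.10/leftir')
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    cv2.imshow('Gen3 RTSP stream', frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):  # press q to quit
        break
cap.release()
cv2.destroyAllWindows()
```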
I suggest submitting your issue to the ros2_kortex_vision repo.
Thanks for your suggestion, but it looks like the ros2_kortex_vision repo has disabled issues. That's why I opened this new issue here to ask for help.
Then the only thing I can add is that most of the customers I heard from who are using vision are mapping RGB to depth on captures while the robot is immobile - in these cases, the delay does not matter. I'm sure some of them must map the streams, but I don't know how they do it.
I will leave this issue open, just in case one such user comes across it and can give you an answer.
Dear @martinleroux,
I had the same issues with these two topics, /camera/depth_register/image_rect and /camera/color/image_raw. Streaming both topics from the robot is too unstable. This makes me wonder whether it makes sense to use the attached camera with the Kinova at all. I regret buying the Kinova vision version when this utility in ROS 2 is so unstable.
I hope Kinova can step in and implement a stable ROS 2 vision driver.
Best regards,
Hi @peterminh227 , I understand your situation. I have already added a second look at the vision driver to my team's to-do list. As I highlighted earlier, the current driver was not developed and is not maintained by Kinova.
Out of curiosity, can you tell me what the output of htop looks like while you are trying to operate vision? An unrelated investigation on our side recently revealed that, under certain circumstances (the specifics of which we have yet to pin down), ROS 2 requires much more processing power than we expected. We've had PCs that could run ROS 1 applications just fine but simply could not handle ROS 2.
Hello @Eric-nguyen1402 ,
I have used the ros2_kortex_vision GitHub repository with a Gen3 on my end, and here are my findings. First, I created a node that subscribes to both the color and depth streams (i.e. the "/camera/color/image_raw" and "/camera/depth_registered/image_rect" topics), calculates the timestamp differences between the received messages, and reports the average of every 100 readings (a simplified sketch of such a node is included after the plot below). Here are the results:
Also, here is a plot showing the differential readings in both seconds and nanoseconds using plotJuggler:
As you can see, the values are mostly well below 0.1 seconds.
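For reference, a simplified sketch of the kind of measurement node I described above (assuming rclpy; the topic names and the 100-reading averaging window are the ones mentioned earlier, and the class name is just a placeholder):

```python
# Simplified sketch of a node that records the stamp difference between the
# latest color and depth messages and reports the average of every 100 readings.
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Image


def stamp_to_sec(stamp):
    return stamp.sec + stamp.nanosec * 1e-9


class StampDiffNode(Node):
    def __init__(self):
        super().__init__('stamp_diff_node')
        self.last_color = None
        self.diffs = []
        self.create_subscription(Image, '/camera/color/image_raw', self.color_cb, 10)
        self.create_subscription(Image, '/camera/depth_registered/image_rect', self.depth_cb, 10)

    def color_cb(self, msg):
        self.last_color = stamp_to_sec(msg.header.stamp)

    def depth_cb(self, msg):
        if self.last_color is None:
            return
        self.diffs.append(abs(stamp_to_sec(msg.header.stamp) - self.last_color))
        if len(self.diffs) >= 100:
            avg = sum(self.diffs) / len(self.diffs)
            self.get_logger().info(f'Average stamp difference over 100 readings: {avg:.4f} s')
            self.diffs.clear()


def main():
    rclpy.init()
    rclpy.spin(StampDiffNode())
    rclpy.shutdown()


if __name__ == '__main__':
    main()
```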
Second, I created a synchronization node using a Synchronizer with an approximate time policy (slop = 0.1 and queue size = 10), and it is working well, although I am sure some out-of-sync messages are dropped. Please let me know if you need further assistance.
Cheers, Abed
Dear @aalmrad
Thank you for your findings. I actually tried a similar approach, but instead of creating a new node, I opted to use the message_filters library, specifically message_filters.ApproximateTimeSynchronizer or message_filters.TimeSynchronizer. These filters only trigger the callback once they have received messages from all specified sources with matching timestamps, which eliminates the need to manually compute or compare the timestamps, so it comes down to the same method as yours.
I also used message_filters.ApproximateTimeSynchronizer with a slop of 0.1 and a queue size similar to yours, but it only worked in a few cases and was too unstable for the project. Therefore, I believe the core issue may not lie in how we synchronize the topics.
Best regards, Eric
Hello @Eric-nguyen1402 ,
Please try my method and let me know if you still face any issues. If it still does not work, we can dig deeper into the problem.
Sincerely, Abed
Hi, I am using the Gen 3 Lite Robot 7DoF with the RealSense D410 camera. For my project, I need RGBD images from the camera, which I access using the ros2_kortex_vision package. Based on the GitHub documentation, I assumed that the /camera/depth_register/image_rect topic is aligned with the /camera/color/image_raw topic. However, when I tested this using PlotJuggler, it showed a significant time difference between these two topics, even after alignment.
I believe this time gap is why the message_filters.ApproximateTimeSynchronizer or message_filters.TimeSynchronizer functions fail to synchronize the messages and trigger the callback function. Interestingly, in some tests it worked briefly, which makes me think the issue isn't with my code, but rather with these two topics being out of sync. Do you have any suggestions on how to fix this issue?