Kawasaki-Robotics / khi_robot

ROS KHI robot meta-package
http://wiki.ros.org/khi_robot
BSD 3-Clause "New" or "Revised" License

Input Capture / External Clock Synchronisation #26

Open thomascent opened 5 years ago

thomascent commented 5 years ago

Hi @d-nakamichi,

We would like to precisely synchronise the frames we take using a gripper-mounted camera with the joint angle readings we receive from the robot arm.

In general, the way to do this is to connect an external clock signal directly to both the camera and the robot, and have them both publish readings on, say, the rising edge of the clock signal.

I don't know if something like this is possible with the F30 controller, but do you have any ideas? Any input you could provide would be much appreciated!

All the best, Tom

gavanderhoorn commented 5 years ago

@thomascent: what level of synchronisation are you looking for? The scheme that you describe is something that would certainly result in synchronised clocks, but it's also difficult to implement if there is no support for it in the hardware (and I doubt a general-purpose robot controller can take a clock input from an external source; I don't know though, so let's wait and see what @d-nakamichi writes).

I've done similar things in the past by configuring all involved hosts with an NTP client and making sure clocks are synchronised "often enough". Depending on your needs (fast-moving vs. semi-static, for instance) that may or may not be sufficient, but most robot controllers I've worked with support NTP, so it's at least something that is relatively straightforward to set up.
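
For a rough sense of what "synchronised often enough" means in practice, here is a minimal sketch (not from the thread) that measures a host's clock offset against an NTP server using the third-party ntplib package; the server address is only an example.

```python
# Minimal sketch: measure this host's clock offset against an NTP server.
# Assumes the third-party "ntplib" package (pip install ntplib); the server
# address is only an example.
import ntplib

client = ntplib.NTPClient()
response = client.request('pool.ntp.org', version=3)

# Offset in seconds between the local clock and the server clock. If every
# host involved (PC, camera host, robot controller) stays within a few
# milliseconds of the same reference, timestamp-based pairing becomes viable.
print('clock offset: %.3f ms' % (response.offset * 1000.0))
```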

thomascent commented 5 years ago

@gavanderhoorn thanks for the quick reply!

Yeah, I think you're probably right, and to be fair I'm not aware of any other robot controllers that would support something like this either.

In my case, I'm hoping to synchronise a gripper camera image with a set of joint angles such that I can resolve a super accurate pose for the camera using FK, even when the end effector is moving at full speed. I haven't really done the math on this, but synchronisation to within a few milliseconds would probably be sufficient for that use case.

We use a RealSense camera, which has a digital input for an external trigger but otherwise runs over USB. Synchronising the robot controller and our computer via NTP sounds like a good idea, thanks for the heads-up!
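
As an illustration of pairing the two data streams once the clocks agree, here is a minimal ROS-side sketch using message_filters; the topic names and the 5 ms slop are assumptions, not something this driver defines.

```python
#!/usr/bin/env python
# Minimal sketch: pair camera images with joint states by header timestamp.
# Topic names and the 5 ms slop are assumptions; the result is only as good
# as the clock synchronisation between the hosts producing the stamps.
import rospy
import message_filters
from sensor_msgs.msg import Image, JointState

def paired(image, joints):
    # Both messages arrived within `slop` seconds of each other; FK on
    # `joints` here gives the camera pose at (approximately) the image time.
    rospy.loginfo('image %.6f <-> joints %.6f',
                  image.header.stamp.to_sec(), joints.header.stamp.to_sec())

rospy.init_node('camera_joint_pairing')
image_sub = message_filters.Subscriber('/camera/color/image_raw', Image)
joint_sub = message_filters.Subscriber('/joint_states', JointState)
sync = message_filters.ApproximateTimeSynchronizer(
    [image_sub, joint_sub], queue_size=10, slop=0.005)
sync.registerCallback(paired)
rospy.spin()
```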

gavanderhoorn commented 5 years ago

Only @d-nakamichi and his colleagues can comment on whether this is true for Kawasaki controllers and the software in this repository as well, but most industrial robot controllers have quite a bit of delay on the state data that they make available. I've worked with robots where there was a 40 ms delay between the actual state of the robot and the data made available to external consumers.
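
If such a reporting delay turns out to exist and is roughly constant (the 40 ms above is an example from other robots, not a Kawasaki figure), one hedged way to account for it is to shift the joint-state timestamps back by the measured latency before any timestamp-based pairing; a minimal sketch:

```python
# Minimal sketch: republish joint states with the header stamp shifted back
# by an assumed, constant reporting latency. The 40 ms value is illustrative
# only; any real delay would have to be measured for the controller at hand.
import rospy
from sensor_msgs.msg import JointState

REPORTING_DELAY = rospy.Duration(0.040)  # assumed constant latency

def relay(msg):
    msg.header.stamp -= REPORTING_DELAY
    pub.publish(msg)

rospy.init_node('joint_state_latency_shift')
pub = rospy.Publisher('/joint_states_shifted', JointState, queue_size=10)
sub = rospy.Subscriber('/joint_states', JointState, relay)
rospy.spin()
```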

> In my case, I'm hoping to synchronise a gripper camera image with a set of joint angles such that I can resolve a super accurate pose for the camera using FK even when the end effector is moving at full speed. I haven't really done the math on this but synchronisation of within a few milliseconds would probably be sufficient for that use case.

Somewhat off-topic, but: if you're only really interested in the Cartesian pose of the EEF (i.e. your camera), you could consider using an external tracking solution, perhaps something like Vicon or OptiTrack? I've seen good results with consumer-grade trackers such as the HTC Vive towers. There are ROS packages available for those as well.

thomascent commented 5 years ago

That's actually our current solution! Well, more or less our current solution anyway; we use image alignment to track the camera's pose, then estimate the delay between that trajectory and the one from FK. This assumes that the delay is constant, though, which isn't necessarily the case.
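
For illustration only, the constant-delay estimate described above can be sketched as a cross-correlation between the two trajectories (e.g. a scalar speed signal from visual tracking vs. one from FK), both resampled to a common rate; all names and rates below are assumptions.

```python
# Minimal sketch: estimate a constant time offset between two trajectories
# resampled to the same rate. A positive result means sig_b lags sig_a.
import numpy as np

def estimate_delay(sig_a, sig_b, rate_hz):
    """Return how far sig_b lags behind sig_a, in seconds."""
    a = sig_a - np.mean(sig_a)
    b = sig_b - np.mean(sig_b)
    corr = np.correlate(b, a, mode='full')
    lag = np.argmax(corr) - (len(a) - 1)   # lag in samples
    return lag / float(rate_hz)

# Synthetic check: sig_b is sig_a delayed by 5 samples (50 ms at 100 Hz).
t = np.arange(0.0, 10.0, 0.01)
sig_a = np.sin(2.0 * np.pi * 0.5 * t)
sig_b = np.roll(sig_a, 5)
print(estimate_delay(sig_a, sig_b, rate_hz=100.0))  # ~0.05
```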

We can also just leave tracking on the whole time, but that'll get a little screwed up if the scene has very little to no texture. External tracking would be nice too but it would add a whole lot of hardware to our system which we'd rather avoid; it's a good suggestion though.

gavanderhoorn commented 5 years ago

> External tracking would be nice too but it would add a whole lot of hardware to our system which we'd rather avoid; it's a good suggestion though.

The Vive solution is one marker + two towers. We have them here in the lab. I haven't compared the performance with VO, but I have a feeling it's going to be much more accurate.

d-nakamichi commented 5 years ago

We have the AS language commands HSENSESET and HSENSE, which monitor a specified sensor signal and capture a position when the signal changes. However, that functionality stays within the AS system. If you want to get the value out to an external system quickly, it requires TCP or UDP communication with our robot controller. Currently we don't provide a ROS topic interface for this.
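
Purely as a hypothetical sketch of what the PC side of such a TCP/UDP link might look like (the port number, packet layout, and the idea that the controller pushes joint values on a sensor edge are all assumptions, not a description of the KHI controller's actual interface):

```python
# Hypothetical sketch only: a UDP listener on the PC receiving joint values
# that the controller would push when the monitored signal changes. The port
# and packet format (6 little-endian doubles) are invented for illustration;
# the controller-side protocol would have to be written in AS code and is
# not provided by this repository.
import socket
import struct

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(('0.0.0.0', 30000))  # assumed port

while True:
    data, addr = sock.recvfrom(1024)
    if len(data) >= 48:
        joints = struct.unpack('<6d', data[:48])
        print('joints at trigger from %s: %s' % (addr[0], joints))
```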