Jmeyer1292 / robot_cal_tools

A suite of tools focused on calibration of sensors for robotic workcell development
Apache License 2.0

Advice on calibration #75

Closed johntraynor closed 4 years ago

johntraynor commented 4 years ago

Hi guys,

With great help from Chris, I managed to get a calibration working using the static_camera_example.launch / static_camera_extrinsic.cpp setup in the examples. I am conscious of taking up too much of Chris's time, so hopefully someone can point me in the right direction on the problem I am having.

Now, the problem I am facing is the following:

So, a few questions:

Just some notes:

Any help with this would be greatly appreciated

Thanks JT

Non-skewed: these types of images and poses work, although the calibration is not the most accurate (image)

Skewed: these types are really throwing the calibration off (image)

marip8 commented 4 years ago
  • I can get an OK calibration (+/- 10mm in most of the axes & < 3px per dot) when I use images & poses which are not that skewed

Depending on the parameters of your camera, I would expect the final residual error to be less than half a pixel. What is the resolution of your camera? Also, where did you get the +/- 10mm number? Do you know the actual location of your camera?

  • Has anyone done a calibration with this example?

I haven't done a static camera example in a while, but we will be working on one today as an example of the ChArUco target detection that @jaberkebile has been working on. We'll upload the example data soon (by next Wednesday at the latest) and use the static camera example to perform the calibration. We'll check and make sure it's working properly.

  • I've checked all the images and all the correspondences are being collected properly

It is very important that the correspondences are assigned correctly, so it's good to hear you've checked these. Just FYI, for the circle grid targets the pattern always starts in the top-left corner (when looking at the target with the large circle on the bottom row on the left) and works its way across the columns in each row until reaching the bottom-right circle. The largest circle should always be labeled as the origin.
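Just to illustrate that ordering (this is not the rct_image_tools implementation; the point spacing and axis conventions below are assumptions you'd need to match to your own target definition), the target points would be enumerated something like this:

#include <Eigen/Core>
#include <vector>

// Hypothetical sketch: enumerate the grid points row by row, starting at the
// top-left circle and moving left to right across each row, so index i of the
// target points matches detected feature i.
std::vector<Eigen::Vector3d> makeGridPoints(int rows, int cols, double spacing)
{
  std::vector<Eigen::Vector3d> points;
  points.reserve(rows * cols);
  for (int r = 0; r < rows; ++r)        // top row first
    for (int c = 0; c < cols; ++c)      // left to right across the columns
      points.emplace_back(c * spacing, (rows - 1 - r) * spacing, 0.0);
  return points;
}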

  • Any ideas what could be wrong? Why does introducing angled images and poses into the same code produce such bad results? (It should make it better.)
  1. Make sure all of the correspondences are actually identified correctly
    • It sounds like you are already doing this
  2. Try out the recently-added calibration validation features
    • Homography validation
      • This tool should verify that your correspondences are correct (see the sketch after this list)
      • Example
    • Camera Noise Measurement
      • This tool measures the variability in pose estimates from a target observation with an intrinsically calibrated camera. The idea here is that you take the same picture N times (without moving the camera and target), run this analysis, and see how much noise shows up in the estimates. This should give you a better idea of how precisely the sensor you are using can measure the target when nothing has changed between images.
      • Example
    • Covariance Analysis
      • The hand-eye calibration struct now returns an object describing the covariance of the calibration optimization. You could print the covariance results and see if there are any very high correlations between variables. My guess is that they will all be relatively low (< 0.7), so this may not be too helpful.
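To illustrate the idea behind the homography check (a minimal sketch using plain OpenCV rather than the actual rct_image_tools API; the helper name and the 1 px threshold are assumptions): a planar target viewed by a pinhole camera must map to the image through a homography, so you can fit one to your correspondences and inspect the per-point residuals. A wrongly assigned circle shows up as a large residual.

#include <opencv2/calib3d.hpp>
#include <opencv2/core.hpp>
#include <cmath>
#include <iostream>
#include <vector>

// Hypothetical helper, not part of robot_cal_tools: target_xy holds the known planar
// target coordinates of each circle, and image_uv holds the detected image coordinates
// in the same order (i.e. the correspondences being validated).
bool correspondencesLookValid(const std::vector<cv::Point2f>& target_xy,
                              const std::vector<cv::Point2f>& image_uv,
                              double max_error_px = 1.0)
{
  // A planar target seen by a pinhole camera maps to the image by a homography
  cv::Mat H = cv::findHomography(target_xy, image_uv, 0);  // plain least-squares fit
  if (H.empty())
    return false;

  std::vector<cv::Point2f> projected;
  cv::perspectiveTransform(target_xy, projected, H);

  for (std::size_t i = 0; i < image_uv.size(); ++i)
  {
    // A mis-assigned correspondence produces a large residual here
    double err = std::hypot(projected[i].x - image_uv[i].x,
                            projected[i].y - image_uv[i].y);
    if (err > max_error_px)
    {
      std::cout << "Correspondence " << i << " deviates by " << err << " px\n";
      return false;
    }
  }
  return true;
}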

I think the most straightforward thing we can do to help is to write a few additional unit tests here and here that specifically test the static-camera-moving-target use case. Looking through the unit tests we have, it seems we have actually only written tests for the moving-camera-static-target use case, so there is a possibility that the math is not totally correct. If you want to try to tackle this, it would be much appreciated. Otherwise, I can create the tests over the next couple of days as I find time.

drchrislewis commented 4 years ago

@johntraynor @marip8 A few things. First, the expected residual for hand-eye calibration is much higher than for intrinsic calibration, mainly because the robot kinematics themselves are inaccurate. Next, it should make no difference whether the target is mounted on the robot or the camera is mounted on the robot; the math is exactly the same. Here, I'm not referring to robot_cal_tools, but to the calibration library, in which the transform from the target mount to the camera mount frame is retrieved from TF. Next, I always suspect initial conditions when using robot_cal_tools, since these are not obtained from a URDF model that looks approximately correct. Finally, your robot calibration may be way off; if so, this will be more pronounced when tilt angles are greater. Sorry I've not had time to look at your images. I'm swamped this week.

johntraynor commented 4 years ago

Hi guys,

Much appreciate the inputs.

Quick question - could bad initial guesses work for the relatively flat poses but go astray for the steeply angled ones? It seems strange that I can get a decent calibration with these initial guesses, but once I throw the skewed images in, it all goes awry!

Chris - absolutely no problem on the delayed responses. I'm only too delighted when you do respond as it helps me better understand this whole area.

johntraynor commented 4 years ago

Hi guys,

I was away for a few days so I'm just catching up. Any luck running the new ChArUco target software with the static camera example?

Thanks in advance

marip8 commented 4 years ago

@johntraynor we finished the calibration and put together an example of static-camera-moving-target extrinsic calibration in #73. For reference, we used a 2 megapixel AVT camera with 6mm lens. The optimization converges with a final squared cost of 1.3 px^2 (1.14 pixels per feature) which is reasonably good. This was also using raw, non-rectified images.

Quick question - could bad initial guesses work for the relatively flat poses but go astray for the steeply angled ones? It seems strange that I can get a decent calibration with these initial guesses, but once I throw the skewed images in, it all goes awry!

Several of the images that we took for this calibration had significant skew, and the optimization still worked correctly. You should take a look at the images uploaded for the example and see how they compare to your skewed images. Something doesn't quite add up with the calibration working for straight-on views but not for skewed ones. It should succeed in both cases and give you very similar results; I would think that adding skewed images could actually help improve the final result.

I would take a look at the example in my fork (cleaned up version of #73 which I'll push onto that PR soon) and see if it looks similar to the way you're constructing your calibration.

One other thing I might mention is that the poses you collect with the images must very accurately represent the configuration of your hardware. I've run into problems before where a previously unnoticed mistake in the URDF model causes a discrepancy between the robot controller reported tool flange transform and the TF reported tool flange transform, resulting in a bad calibration. You might double check that your ROS models and ROS-reported transforms are accurate.

johntraynor commented 4 years ago

Really appreciate the feedback and data. It has to be something wrong with our poses. I'll run the code with your data set just to make sure it all works well before we look at our data again.

One question - we are getting our poses directly from a UR robot over a TCP socket connection through a small robot program we wrote. We are not using ROS to collect the poses, so I suspect this may be where it is all going wrong. Basically we open a socket to the robot, then request the pose using this function:

get_forward_kin()

which basically sends us back the RX, RY, RZ, x, y, z values of the robot, and we then save the image at the same time we save the pose. Is this totally the wrong way of getting the correct pose of the robot?

marip8 commented 4 years ago

You don't necessarily need to collect pose data from ROS (although there is a good, well-supported ROS driver for UR). Conceptually there should be no reason why collecting data the way you are now wouldn't work.

One tricky aspect is making sure you interpret the Rx/Ry/Rz values correctly. I'm not sure if those are Euler angles (and if they are, in which order they are applied: i.e. XYZ, ZYX, etc.) or scaled axis angle rotations, etc. If you interpret these incorrectly, then that would definitely wreck your calibration results.

@drchrislewis has suggested creating a utility to display the poses in RViz so you can verify that your robot pose really looked like that before running the calibration. That would be pretty simple to do. Alternatively, you could overlay the expected origin of the target onto each image you acquired by projecting the target origin into the image plane using your initial target/camera guesses. If the origin appears to be pretty close to where you expected it to be, then you would have confidence that the data you collected was valid.
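As a rough sketch of that second option (the frame composition below assumes a static-camera / moving-target setup, the helper name is made up, and the intrinsics and initial guesses would come from your own configuration):

#include <Eigen/Geometry>
#include <opencv2/calib3d.hpp>
#include <opencv2/imgproc.hpp>
#include <vector>

// Hypothetical helper: project the target origin into one image using the initial
// guesses and mark it, so you can visually confirm the collected pose is plausible.
void drawTargetOrigin(cv::Mat& image,
                      const cv::Mat& camera_matrix,              // 3x3 intrinsic matrix
                      const cv::Mat& dist_coeffs,                // distortion coefficients
                      const Eigen::Isometry3d& base_to_flange,   // robot pose saved with this image
                      const Eigen::Isometry3d& flange_to_target, // initial guess
                      const Eigen::Isometry3d& base_to_camera)   // initial guess
{
  // Pose of the target expressed in the camera frame
  Eigen::Isometry3d camera_to_target =
      base_to_camera.inverse() * base_to_flange * flange_to_target;

  // Convert to the rvec/tvec form that OpenCV expects
  Eigen::AngleAxisd aa(camera_to_target.rotation());
  Eigen::Vector3d r = aa.axis() * aa.angle();
  Eigen::Vector3d t = camera_to_target.translation();
  cv::Mat rvec = (cv::Mat_<double>(3, 1) << r.x(), r.y(), r.z());
  cv::Mat tvec = (cv::Mat_<double>(3, 1) << t.x(), t.y(), t.z());

  // Project the target origin (0, 0, 0) into the image plane
  std::vector<cv::Point3f> object_pts{cv::Point3f(0.f, 0.f, 0.f)};
  std::vector<cv::Point2f> image_pts;
  cv::projectPoints(object_pts, rvec, tvec, camera_matrix, dist_coeffs, image_pts);

  // Mark the projection; it should land near the target's origin circle if the pose is sane
  cv::circle(image, cv::Point(cvRound(image_pts[0].x), cvRound(image_pts[0].y)),
             10, cv::Scalar(0, 0, 255), 2);
}

If the marker is nowhere near the target in several of your skewed images, the pose data (or how you interpreted it) is the likely culprit.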

One benefit of using ROS for calibration data collection is that rct_ros_tools has a pretty simple utility for getting poses and images and saving them correctly into a data structure that can be easily loaded into a calibration. If you have some experience with ROS, it would probably be worth giving this a shot. Basically you would run the UR ROS driver, move the robot around to the calibration positions (manually with the teach pendant or in an automated way), and use the services defined in this node to collect and save the data.

gavanderhoorn commented 4 years ago

One tricky aspect is making sure you interpret the Rx/Ry/Rz values correctly. I'm not sure if those are Euler angles (and if they are, in which order they are applied: i.e. XYZ, ZYX, etc.) or scaled axis angle rotations, etc. If you interpret these incorrectly, then that would definitely wreck your calibration results.

Yes. UR by default (on the teach pendant, for instance) uses axis-angle. I haven't checked what get_forward_kin() returns, but you really want to make sure you interpret it correctly, as otherwise things won't work.

You don't necessarily need to collect pose data from ROS (although there is a good, well-supported ROS driver for UR).

@johntraynor: if you'd like to verify whether your custom script is the cause: the driver can be used to just read data from the robot (such as the current pose) without requiring any setup on the controller itself.

Only when motion should be controlled by the ROS side would you need to install the URCap.

If you do try out the driver, be sure to follow the steps to extract the calibration data, to make sure your URDF corresponds to your particular robot.

Additionally/alternatively: the tool0_controller frame broadcast by the driver corresponds to the EEF (+ toolframe) position the controller broadcasts itself.

johntraynor commented 4 years ago

Hi guys, some great help and feedback here, so I have lots to go on. I'm pretty convinced it's the rotation we are inputting in the wrong format. Can I just get someone to clarify: does the calibration expect the rotation as roll/pitch/yaw and not Euler angles prior to conversion to a quaternion? Thanks

marip8 commented 4 years ago

Can I just get someone to clarify: does the calibration expect the rotation as roll/pitch/yaw and not Euler angles prior to conversion to a quaternion?

Technically, the pose can be input in any form as long as the user converts it to a 4x4 transformation matrix (Eigen::Isometry3d). All of the calibration-related classes/structures use this data type to represent a transformation. Eigen has a variety of functions for converting Euler angles, RPY, axis-angle rotations, etc. to this type of transformation, so there's nothing to stop you from taking your orientation format and converting it to an Eigen::Isometry3d.
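For example, a minimal sketch with Eigen (the function names are just for illustration), assuming the pose arrives as a translation plus either a quaternion or fixed-axis roll/pitch/yaw:

#include <Eigen/Geometry>

// From a translation plus quaternion (the form the rct_ros_tools YAML loaders expect)
Eigen::Isometry3d fromQuaternion(double x, double y, double z,
                                 double qx, double qy, double qz, double qw)
{
  Eigen::Isometry3d pose = Eigen::Isometry3d::Identity();
  pose.translation() = Eigen::Vector3d(x, y, z);
  pose.linear() = Eigen::Quaterniond(qw, qx, qy, qz).normalized().toRotationMatrix();
  return pose;
}

// From a translation plus fixed-axis roll/pitch/yaw (applied as yaw * pitch * roll)
Eigen::Isometry3d fromRPY(double x, double y, double z,
                          double roll, double pitch, double yaw)
{
  Eigen::Isometry3d pose = Eigen::Isometry3d::Identity();
  pose.translation() = Eigen::Vector3d(x, y, z);
  pose.linear() = (Eigen::AngleAxisd(yaw, Eigen::Vector3d::UnitZ())
                 * Eigen::AngleAxisd(pitch, Eigen::Vector3d::UnitY())
                 * Eigen::AngleAxisd(roll, Eigen::Vector3d::UnitX())).toRotationMatrix();
  return pose;
}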

If you want to use the YAML loaders that we've created in rct_ros_tools, then the YAML files need to have the orientation in quaternion form.

johntraynor commented 4 years ago

OK that makes total sense

I have been using this code to convert the axis-angle from the UR to a quaternion before I save to the YAML files, but I don't think it's correct.

(x, y & z below are the axis-angle values from the UR robot)

Eigen::Quaterniond q;
q = AngleAxisd(x, Vector3d::UnitX()) * AngleAxisd(y, Vector3d::UnitY()) * AngleAxisd(z, Vector3d::UnitZ());

marip8 commented 4 years ago

I'm not totally sure. What you posted above is really equivalent to Euler angles applied XYZ. Sometimes axis-angle has 4 values (the angle plus the x/y/z components of the axis vector), and sometimes it has 3 values where the angle is the magnitude of the vector and the axis is the normalized vector of those values, like this:

Eigen::Vector3d rotation(x, y, z);             // the 3-value rotation vector (Rx, Ry, Rz)
double angle = rotation.norm();                // rotation angle is the vector's magnitude
Eigen::Vector3d axis = rotation.normalized();  // rotation axis is the normalized vector
Eigen::Quaterniond q(Eigen::AngleAxisd(angle, axis));

This document on the UR website seems to indicate that the UR controller uses the latter, 3-value method

gavanderhoorn commented 4 years ago

Or just use the driver ;)

johntraynor commented 4 years ago

Lesson learnt, lads. I thought my approach would be quicker just to get up and going, but I was totally wrong. The only good thing is I have learnt a lot. Thanks for all your patience.

gavanderhoorn commented 4 years ago

I was kidding though: it's of course fine to do whatever has your preference.

johntraynor commented 4 years ago

No, you are dead right. I should have gone with the driver in the first place; I would have saved myself and you guys a lot of hassle.

I just refactored in the new code that Michael gave me, ran it on a small set of 4 images/poses, and the calibration results look good. I'll do a proper run tomorrow and hopefully we can close out this issue. Thanks again all.

johntraynor commented 4 years ago

OK lads, just to confirm: the problem was how we were saving the robot pose to the YAML files. We ran a new calibration with 15 sets and everything looks good. Thanks for all the help again. I've no doubt I'll be on again with more questions!

DavidMerzJr commented 4 years ago

@johntraynor Great to hear. We have been pretty swamped these past few weeks and can't be very helpful as a result.

johntraynor commented 4 years ago

Thanks again - great set of tools

marip8 commented 4 years ago

Good to hear that this resolved your issue. Feel free to open other issues as you encounter them, and we'll do what we can to support