moveit / moveit_calibration

Hand-eye calibration tools for robot arms.
BSD 3-Clause "New" or "Revised" License

solution is way off #60

Open christian-rauch opened 3 years ago

christian-rauch commented 3 years ago

Setup

I am trying to calibrate in an eye-in-hand setting, where the "hand" is actually the head of a fixed-base robot with a top-mounted camera. I am looking for the extrinsic calibration between the head and the camera mount point.

The kinematic chain is as follows: base → torso → head → camera_mount → optical-frame

base → torso → head is given by the joint states, and camera_mount → optical-frame is a static transform provided by the camera model (an Intel RealSense in this case). I am looking for the transformation head → camera_mount, which I can obtain via head → optical-frame (minus the known static transform __camera_mount → optical-frame__).
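
For illustration, this is the composition I have in mind (a rough numpy sketch; the matrices are placeholders, not real values):

import numpy as np

# Placeholder 4x4 homogeneous transforms:
# T_head_optical:  head -> optical-frame (what the hand-eye calibration estimates)
# T_mount_optical: camera_mount -> optical-frame (static, from the camera model)
T_head_optical = np.eye(4)
T_mount_optical = np.eye(4)

# head -> camera_mount = (head -> optical-frame) * (optical-frame -> camera_mount)
T_head_mount = T_head_optical @ np.linalg.inv(T_mount_optical)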

Edit: From https://github.com/ros-planning/moveit_calibration/pull/58#issuecomment-762422033, it is actually possible to find head → camera_mount directly by setting the camera_mount as the "Sensor frame".

Calibration

This figure shows the setup with an initial guess: rviz_screenshot_2021_01_18-13_56_06

$ rosrun tf tf_echo HEAD_JOINT1_Link nextagea_realsense_color_optical_frame
At time 1610978311.120
- Translation: [0.021, 0.033, 0.165]
- Rotation: in Quaternion [-0.558, 0.558, -0.435, 0.435]
            in RPY (radian) [-1.817, 0.000, -1.571]
            in RPY (degree) [-104.106, 0.005, -90.001]

After collecting samples and running the solver, the plugin provides the "solution": rviz_screenshot_2021_01_18-14_01_03

$ rosrun tf tf_echo HEAD_JOINT1_Link nextagea_realsense_color_optical_frame
At time 1610978514.896
- Translation: [-0.256, 0.124, 0.528]
- Rotation: in Quaternion [0.695, -0.424, 0.432, -0.387]
            in RPY (radian) [-1.917, -0.275, -1.290]
            in RPY (degree) [-109.850, -15.774, -73.896]

Is this an issue with the solver, or is the result being interpreted incorrectly?

JStech commented 3 years ago

How many degrees of freedom does the head have? The calibration is only solvable if there are rotations around at least two axes. I'm not sure from the images you posted whether the head just nods or also rotates.
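
As a rough sanity check (just a sketch, assuming the end-effector poses for your samples are available as 4x4 matrices), you can look at how many independent rotation axes the samples actually span:

import numpy as np
from scipy.spatial.transform import Rotation

def rotation_axes_rank(eef_poses, tol=1e-3):
    # eef_poses: list of 4x4 end-effector poses (e.g. effector_wrt_world)
    R0 = eef_poses[0][:3, :3]
    rotvecs = []
    for T in eef_poses[1:]:
        # relative rotation w.r.t. the first pose, as an axis-angle vector
        rotvecs.append(Rotation.from_matrix(R0.T @ T[:3, :3]).as_rotvec())
    # rank >= 2 means the samples rotate about at least two independent axes
    return np.linalg.matrix_rank(np.array(rotvecs), tol=tol)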

Besides that, it's hard to debug a bad calibration. The initial result after 5 or 6 poses can be pretty bad, but it then improves as you add more samples (up to about 15). It's important that you have good camera intrinsics, although the RealSense factory calibrations have been pretty good in my experience, so that's probably not your issue.

christian-rauch commented 3 years ago

How many degrees of freedom does the head have?

The head has 2 DoF, but the whole chain base → torso → head has 3 DoF in total; the torso only rotates about z. (I am not showing all frames in the images.)

Besides that, it's hard to debug a bad calibration.

I did the calibration multiple times with different samples in that setup and always got the same result. Is there a way to run the solver manually (without RViz) and maybe inspect the solver result and other result flags?

JStech commented 3 years ago

Sorry for such a late reply. You can export the samples using the "Save samples" button on the control tab. You can then import them into Python and run the solver directly.

christian-rauch commented 3 years ago

Yes, I saw that the solver is just a Python script that is executed from within the C++ code. Is there a script or example which can use the exported data to debug the solver?

JStech commented 3 years ago

So, I recently just did this, but I can't find my script--it might've just been in the interpreter. Here's a rough sketch:


import yaml
import handeye  # the solver package used by the MoveIt Calibration plugin
import numpy as np

calibrator = handeye.calibrator.HandEyeCalibrator()

# samples exported via the "Save samples" button
with open('samples.yaml') as f:
    samples = yaml.safe_load(f)

for sample in samples:
    # end-effector pose in the world/base frame
    T_eef = np.array(sample['effector_wrt_world']).reshape((4, 4))
    # calibration target pose in the camera frame
    T_cam = np.array(sample['object_wrt_sensor']).reshape((4, 4))
    calibrator.add_sample(T_eef, T_cam)

calibrator.solve()
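
To compare against your tf_echo numbers, something like this should work, continuing the sketch above and assuming solve() returns the estimated transform as a 4x4 homogeneous matrix (worth double-checking against the handeye package):

from scipy.spatial.transform import Rotation

# assumption: solve() returns the hand-eye transform as a 4x4 homogeneous matrix
T = calibrator.solve()
print('translation:', T[:3, 3])
print('quaternion (x, y, z, w):', Rotation.from_matrix(T[:3, :3]).as_quat())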

tejaswid commented 3 years ago

Not sure if it is due to the same thing, but I had the same issue and found that the camera intrinsic parameters were not set correctly, even though I had set the correct camera_info topic in the GUI. I hard-coded them and everything worked like a charm. I will open a new bug report for this.

JStech commented 3 years ago

@tejaswid Do you mean the CameraInfo messages had the correct intrinsics, but somehow they weren't loaded correctly within MoveIt Calibration?

tejaswid commented 3 years ago

Yes that is correct.

tejaswid commented 3 years ago

The incorrect intrinsics were caused by empty distortion parameters in the ROS CameraInfo message. You can view the discussion here.
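
For anyone hitting the same thing, a quick way to check what the driver is actually publishing (a rospy sketch; the topic name is just an example, adjust it to your camera):

import rospy
from sensor_msgs.msg import CameraInfo

rospy.init_node('camera_info_check')
# example topic name; use the one configured in the GUI
msg = rospy.wait_for_message('/camera/color/camera_info', CameraInfo)
print('K:', list(msg.K))
print('D:', list(msg.D))  # an empty D was the cause of the bad intrinsics here
print('distortion_model:', msg.distortion_model)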