andyzeng / visual-pushing-grasping

Train robotic agents to learn to plan pushing and grasping actions for manipulation with deep reinforcement learning.
http://vpg.cs.princeton.edu/
BSD 2-Clause "Simplified" License

Depth Camera Calibration #13


jundengdeng commented 5 years ago

Hi Andy,

Thanks for sharing the code. I'm quite interested in the depth camera calibration process; could you please share more details? For example:

  1. How did you get the workspace limit in robot coordinates?
  2. How did you get the checkerboard offset from tool?
  3. Where can I download the checkerboard picture?

Thanks in advance!

Best,

Jun

nro-bot commented 5 years ago

I'm not sure this is correct, but:

  1. Using the pendant, the limits are the X, Y, Z values displayed under the "TCP" box (the pendant displays mm; the code is in meters), e.g. (there's a code sketch just after this list):

    [[0.4, 0.75], [-0.25, 0.15], [-0.2 + 0.4, -0.1 + 0.4]]  [1]
    [[min x, max x], [min y, max y], [min z, max z]]
  2. This is also just experimentally measured. I'm least certain about this part, but I think it is the displacement the tool would need to make to reach the checkerboard center, e.g. +20 cm in X and -0.01 cm in Z. Presumably the tool center = the middle of the area between the gripper fingers.

EDIT: Wow, not sure what I was thinking, but the offset is relative to the "tool center" of the robot (what is reported on the pendant / over TCP from the UR). As for the sign of the offset: it's really checkerboard_pos = tool_pos + offset, so define the offset accordingly. Well, that's my current belief based on inspecting the code, but maybe I will update it tomorrow, who knows. end edit
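
To make item 1 concrete, here's a minimal sketch of the workspace-limits format, assuming the same numpy-array layout that the repo's main.py uses (the numbers are just the example above, not a calibrated set):

    import numpy as np

    # Each row is [min, max] in meters, in the robot base frame.
    # Values copied from the example above -- measure your own on the pendant.
    workspace_limits = np.asarray([[0.4, 0.75],                # x
                                   [-0.25, 0.15],              # y
                                   [-0.2 + 0.4, -0.1 + 0.4]])  # z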

The readme implies this calibration isn't so important if you're using the Intel RealSense D415. For what it's worth, the format of the files is shown below (ignore the actual values).

EDIT: Yup, changed my mind. The calibration actually provides the pose of the camera relative to the robot frame. That way, the image from the camera, which may be looking at the workspace from the side or at an angle, can be warped/transformed as if it came from a perfectly "bird's-eye" camera; see the sketch after the file listing below. end edit

--

Also, for starting out, a blank file named camera_depth_scale.txt will suffice to get past the errors that prevent the code from running.

real/camera_depth_scale.txt

    1.012695312500000000e+00

real/camera_pose.txt

    9.968040993643140224e-01 -1.695732684590832429e-02 -7.806431039047095899e-02 6.748152280106306522e-01
    5.533242197034894325e-03 -9.602075096454146808e-01 2.792327374276499796e-01 -3.416026459607500732e-01
    -7.969297786685919371e-02 -2.787722860809356273e-01 -9.570449528584960008e-01 6.668261082482905833e-01
    0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 1.000000000000000000e+00
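
To make the second EDIT above concrete, here's a minimal sketch of how I believe those two files get used: camera_pose.txt as a 4x4 homogeneous camera-to-robot transform, camera_depth_scale.txt as a scalar correction on raw depth. The helper name is mine, not the repo's:

    import numpy as np

    cam_pose = np.loadtxt('real/camera_pose.txt')                # 4x4 camera -> robot transform
    cam_depth_scale = np.loadtxt('real/camera_depth_scale.txt')  # scalar depth correction

    def camera_to_robot(pt_camera):
        """Map a 3D point (meters, camera frame) into the robot base frame."""
        pt = np.append(pt_camera, 1.0)  # homogeneous coordinates
        return cam_pose.dot(pt)[:3]

    # e.g. a point 0.5 m straight in front of the camera:
    print(camera_to_robot([0.0, 0.0, 0.5]))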
  3. Any 4x4 checkerboard will work. I used some online checkerboard generator and then printed it out.
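
A 4x4-square board exposes a 3x3 grid of inner corners, so if you want to sanity-check your printout with OpenCV, detection would look roughly like this (the file name and pattern size are my assumptions):

    import cv2

    img = cv2.imread('checkerboard_view.png')  # hypothetical capture from the camera
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)

    # 4x4 squares -> 3x3 inner corners
    found, corners = cv2.findChessboardCorners(gray, (3, 3), None)
    if found:
        center = corners.reshape(-1, 2)[4]  # middle corner ~ board center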

[1] Note that it's possible the pendant display somehow differs from the actual TCP values -- my z-values were 0.07 on the pendant but 0.47 in Python. To debug, you can use examples/simple.py from https://github.com/SintefManufacturing/python-urx (see the sketch below).
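
For reference, reading the live TCP pose over the network with python-urx takes just a couple of lines (the IP is a placeholder for your UR controller):

    import urx

    rob = urx.Robot("192.168.1.100")  # placeholder controller IP
    print(rob.getl())                 # [x, y, z, rx, ry, rz], meters / radians
    rob.close()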

Guptajakala commented 3 years ago

@nouyang

Hi, as for what you mentioned:

> Wow, not sure what I was thinking, but the offset is relative to the "tool center" of the robot (what is reported on the pendant / over TCP from the UR). As for the sign of the offset: it's really checkerboard_pos = tool_pos + offset, so define the offset accordingly. Well, that's my current belief based on inspecting the code, but maybe I will update it tomorrow, who knows. end edit

Do you have any idea how this offset between the marker and the tool center is measured? The marker is on the surface while the tool center is inside.

nro-bot commented 3 years ago

@Guptajakala I think if you look at the source code, it is just the x- or y-axis offset (whichever axis the robot must move along to get from the tool to the checkerboard), i.e. looking from above, the 1D distance. There's a sketch below.
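
For what it's worth, here's how I read the offset entering the math, per the checkerboard_pos = tool_pos + offset relation above (all values are placeholders, and only the component along that 1D axis really matters):

    import numpy as np

    # Placeholders -- measure for your own gripper / board mounting.
    checkerboard_offset_from_tool = np.array([0.0, -0.13, 0.02])  # meters

    tool_pos = np.array([0.55, -0.05, 0.10])  # hypothetical TCP position from the robot
    checkerboard_pos = tool_pos + checkerboard_offset_from_tool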

Guptajakala commented 3 years ago

@nouyang Nice, thank you so much!