IFL-CAMP / easy_handeye

Automated, hardware-independent Hand-Eye Calibration

Kinova camera calibration #19

Closed GianlucaCerilli closed 5 years ago

GianlucaCerilli commented 5 years ago

Hello,

I am on Ubuntu 16.04 with ROS Kinetic, and I am using a Kinova Jaco2 arm with a monocular camera mounted on the end effector. Is it possible to use this package for the camera calibration? Is it also possible to move the robot during the calibration without using a simulator?

Thank you very much

marcoesposito1988 commented 5 years ago

Hi Gianluca,

yes, that should be possible. You can use kinova-ros to move the robot with MoveIt. You will need to find a ROS driver for the camera and perform the intrinsic calibration. Once that is done, you can use the camera to track AR markers with a package like aruco, so you can print out an AR marker that you generate online. You will have to provide the marker ID and size to aruco for it to work (check the actual dimensions of the printed marker with a caliper! the printer might scale it). Finally, you can perform the hand-eye calibration in the eye-on-hand case as described in the docs of this package. Then you can use the publisher to have the computed calibration published on tf while running your own code.
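
As a rough sketch, the calibration launch file could look like the following. The Jaco2 frame names assume a j2n6s300 model and the camera frames are placeholders; adapt everything to your setup, and check the docs for the full list of arguments:

<launch>
    <!-- ArUco marker data: measure the printed marker with a caliper! -->
    <arg name="marker_size" default="0.1"/>
    <arg name="marker_id" default="582"/>

    <!-- start your camera driver and the aruco_ros tracker here, as in the docs -->

    <!-- eye-on-hand hand-eye calibration -->
    <include file="$(find easy_handeye)/launch/calibrate.launch">
        <arg name="eye_on_hand" value="true"/>
        <!-- tracking side: the transform published by aruco -->
        <arg name="tracking_base_frame" value="camera_link"/>
        <arg name="tracking_marker_frame" value="camera_marker"/>
        <!-- robot side: the transform given by the forward kinematics -->
        <arg name="robot_base_frame" value="j2n6s300_link_base"/>
        <arg name="robot_effector_frame" value="j2n6s300_end_effector"/>
    </include>
</launch>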

GianlucaCerilli commented 5 years ago

Hi @marcoesposito1988,

Thank you for your reply. There are some things I am not sure about regarding the visp calibrator; maybe you can kindly help me.

When you use an AR marker (do you need only one, or more?) as the calibration object, you put it in a known location with respect to the world frame (that is, the base of the robot). But which point of the marker (e.g. the origin of the object frame) do you consider when measuring its position with respect to the world frame? Do you also have to specify it in the code?

In visp_hand2eye_calibration you have to feed in the camera-object transformations. But how can you get them, since you don't know the pose of the camera with respect to the object?

Finally, is it also necessary to change some files or parameters in the code (like the ROS topics for the camera and end effector) to run the calibration?

Thank you very much and sorry for the many questions! Best regards

marcoesposito1988 commented 5 years ago

That is the beauty of the hand-eye calibration algorithm: the location of the AR marker is not known; it just needs to be constant across all the samples. You don't need to change any code; you just have to pass the needed information through the launch file arguments as specified in the docs.

The algorithm builds a system of equations that assumes the transformation between the base of the robot and the marker is constant. This means that the AR marker must not move, but also that the tracking must be reliable. So you should take care that the marker is extremely flat (you can glue it to a metal plate, for example); that the lighting is good; that you acquire samples only when the robot has stopped moving and the tracking is stable; and that you never place the camera so that it looks straight at the marker (the tracking becomes unstable when the marker appears as a symmetric square).
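
In symbols, just to sketch the underlying math: let $X$ be the unknown hand-to-camera transform, and for each sample $i$ let ${}^{b}T_{h}(i)$ be the hand pose from the robot's forward kinematics and ${}^{c}T_{m}(i)$ the marker pose from the tracking. The constancy assumption is

$$ {}^{b}T_{m} \;=\; {}^{b}T_{h}(i)\, X \,{}^{c}T_{m}(i) \;=\; \mathrm{const.} \qquad \forall i $$

so combining any two samples $i, j$ eliminates the unknown marker pose and leaves the classic $AX = XB$ problem:

$$ \underbrace{{}^{b}T_{h}(j)^{-1}\,{}^{b}T_{h}(i)}_{A}\; X \;=\; X\; \underbrace{{}^{c}T_{m}(j)\,{}^{c}T_{m}(i)^{-1}}_{B} $$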

GianlucaCerilli commented 5 years ago

Hi @marcoesposito1988 ,

Thank you for the explanation, now I understand, and this is very convenient. But there is one more thing that is not really clear to me. I have intrinsically calibrated the camera. For the tracking I use an ArUco marker (I guess "original ArUco" under the dictionary option), and I have the end effector with the gripper and I mount the camera on it (at a chosen point on the hand, without removing the gripper). So I don't understand what marker_frame, ref_frame and camera_frame should be in my case. I don't think I have a camera frame, for example, since the robot URDF does not include the camera.

This is what I have written for the ArUco part (without the three fields that I mentioned):

<include file="$(find pylon_camera)/launch/pylon_camera_node.launch" >
</include>

<!-- start ArUco -->
<node name="aruco_tracker" pkg="aruco_ros" type="single">
    <remap from="/camera_info" to="/pylon_camera_node/camera_info" />
    <remap from="/image" to="/pylon_camera_node/image_raw" />
    <param name="marker_size"        value="$(arg marker_size)"/>
    <param name="marker_id"          value="$(arg marker_id)"/>
    <param name="reference_frame"    value="camera_link"/>
    <param name="camera_frame"       value="camera_rgb_optical_frame"/>
    <param name="marker_frame"       value="camera_marker" />
</node>

Thank you very much again

marcoesposito1988 commented 5 years ago

To inspect the tf frames present in your system you can use the TF display of rviz, or rqt_tf_tree. aruco should publish on tf the transform between the camera (hence the camera_frame) and the marker (marker_frame).

ref_frame is only needed in the specific case where you need to calibrate with respect to another reference frame, for example when you have to calibrate a Kinect. In that case aruco gives you the transform from the optical centre of the RGB camera, but you want to calibrate with respect to the root of the Kinect tf tree. In your case you should be able to leave it empty or set it equal to camera_frame.
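
For example, if your pylon driver stamps its images with frame_id pylon_camera (an assumption of mine — check the header of /pylon_camera_node/camera_info or rqt_tf_tree to be sure), the three parameters could simply be:

    <param name="reference_frame"    value="pylon_camera"/>
    <param name="camera_frame"       value="pylon_camera"/>
    <param name="marker_frame"       value="camera_marker"/>

and you would then pass pylon_camera as the tracking base frame to easy_handeye as well.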

GianlucaCerilli commented 5 years ago

Hi @marcoesposito1988 ,

I am going to mount the camera on the arm tomorrow or in two days. In the meantime I wrote the launch file to see how it works, but I get some errors. I successfully track the ArUco marker, and if I move the arm I can see it moving in the simulator, but I don't know whether these errors matter for the calibration. Maybe you know. I attach my launch file and some screenshots of what I see when I launch it (the camera is connected but not yet mounted on the arm).

Launch file:

Bash errors:

[screenshots 2018-11-13 17-59-47, 2018-11-13 18-00-38]

rviz_easy_handeye.config:

[screenshot 2018-11-13 17-46-32]

moveit.rviz:

[screenshot 2018-11-13 17-45-31]

rqt_easy_handeye.perspective:

[screenshot 2018-11-13 17-46-56]

Thank you very much and sorry again for bothering you! I hope these errors do not compromise the calibration process, so that I don't have to ask you for any further clarification.

meiqua commented 5 years ago

@GianlucaCerilli This error comes from the move group name. See calibrate.launch, line 25.

marcoesposito1988 commented 5 years ago

@meiqua is correct (many thanks for your support!). The group name should be "arm" (you can see it in the Planning Request/Planning Group section of the rviz MotionPlanning plugin's parameter tree).
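
If you are including easy_handeye's calibrate.launch from your own launch file, that means something like this (the argument should be called move_group, if I recall correctly):

    <include file="$(find easy_handeye)/launch/calibrate.launch">
        <arg name="move_group" value="arm"/>
        <!-- ...the other arguments as before... -->
    </include>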

cygreat7 commented 5 years ago

> @GianlucaCerilli This error comes from the move group name. See calibrate.launch, line 25.

Excuse me: after I run the launch file, every GUI comes up, but I can't get the frame of the ArUco marker, and it reports "Cannot calibrate from current position". Thank you very much for your reply.

marcoesposito1988 commented 5 years ago

> Excuse me: after I run the launch file, every GUI comes up, but I can't get the frame of the ArUco marker, and it reports "Cannot calibrate from current position".

This happens when the starting position of the robot is bad. During the calibration, the robot hand should rotate once about each axis and translate in each direction, returning to the starting pose in between. The automatic movement plugin tries to find a trajectory to each of these poses in advance, before starting the procedure (to avoid aborting after many poses have already been taken, which would be extremely frustrating). However, if at least one pose cannot be reached from the starting position without moving too much, the plugin aborts with this error.

Please try again, positioning the robot in such a way that it can move in all directions without hitting the joint limits.

cygreat7 commented 5 years ago

> This happens when the starting position of the robot is bad. [...] Please try again, positioning the robot in such a way that it can move in all directions without hitting the joint limits.

Thank you very much, and sorry again for bothering you. I followed what you said and moved the Jaco through MoveIt. When I click "check starting pose", bash reports this error:

    [ERROR] [1544750813.024052931]: arm[RRTConnectkConfigDefault]: Unable to sample any valid states for goal tree

When I click "next pose", it prints:

    [ INFO] [1544750919.967655581]: Planning request received for MoveGroup action. Forwarding to planning pipeline.
    [ INFO] [1544750919.968248251]: No optimization objective specified, defaulting to PathLengthOptimizationObjective
    [ INFO] [1544750919.968339246]: Planner configuration 'arm[RRTConnectkConfigDefault]' will use planner 'geometric::RRTConnect'. Additional configuration parameters will be set when the planner is constructed.
    [ INFO] [1544750919.968573415]: arm[RRTConnectkConfigDefault]: Starting planning with 1 states already in datastructure
    [ INFO] [1544750920.004478533]: arm[RRTConnectkConfigDefault]: Created 5 states (2 start + 3 goal)
    [ INFO] [1544750920.004562905]: Solution found in 0.036118 seconds
    [ INFO] [1544750920.008613306]: SimpleSetup: Path simplification took 0.001465 seconds and changed from 4 to 2 states
    Traceback (most recent call last):
      File "/home/cyu/catkin_kinect2/src/easy_handeye/rqt_easy_handeye/src/rqt_easy_handeye/rqt_calibrationmovements.py", line 251, in handle_next_pose
        self.local_mover.execute_plan(plan)
      File "/home/cyu/catkin_kinect2/src/easy_handeye/rqt_easy_handeye/src/rqt_easy_handeye/rqt_calibrationmovements.py", line 110, in execute_plan
        raise RuntimeError("got crazy plan!")
    RuntimeError: got crazy plan!

Thank you very much again.

marcoesposito1988 commented 5 years ago

It looks like the starting pose is still not good. If you look at RViz, you should see that the MoveIt! planning produced a movement that was judged too large by the automatic movement plugin.

If this is not the case, this means that somehow your robot needs to move a lot to perform the small motion planned by the plugin. I am going to document how to adjust the threshold that is used to reject bad plans.

cygreat7 commented 5 years ago

> It looks like the starting pose is still not good. [...] I am going to document how to adjust the threshold that is used to reject bad plans.

It did produce an oversized movement, and there is no way to execute it from the GUI. How can I solve this problem? Thank you for your help.

marcoesposito1988 commented 5 years ago

Alright, I thought the threshold was already a parameter... instead, there is still a big TODO here.

The solution to the problem depends on the movement.

If the movement would lead to a collision with the environment, then it is a matter of the robot's motion planning and I can't help you much. You can try starting from a different position. Or, if you can start the robot in such a way that you can move it from its control panel while it keeps updating its position in ROS (i.e. you can see it moving in rviz), then you can calibrate without the automatic movements.

If the movement is only slightly beyond the threshold and you can't find a way to keep it within the limit, you can change the threshold by editing the code here. You should replace the right-hand side with the maximum angle, in degrees, that you want each joint to move. However, in this case, set the robot to move as slowly as possible and keep your hand on the red button! If anything bad happens, it is your responsibility.

cygreat7 commented 5 years ago

> The solution to the problem depends on the movement. [...] If anything bad happens, it is your responsibility.

Thank you very much for your advice; I will try the method you suggested.

cygreat7 commented 5 years ago

> The solution to the problem depends on the movement. [...] If anything bad happens, it is your responsibility.

I am very sorry to bother you again, and thank you for your advice. I checked my running results and found that I cannot get the marker's frame:

[screenshot 2]

And I don't think the Kinect frame I see in rviz is real:

[screenshot 1]

Below is my launch file; I don't know if there is a problem with it:

[screenshot 2018-12-21 17 10 48]

I have run the aruco_ros package separately, and it can display the Kinect and marker frames in rviz:

[screenshot 3]

Thank you very much and sorry for the many questions! Best regards

marcoesposito1988 commented 5 years ago

The problem is the aruco reference_frame argument.

It is due to the fact that in tf you need a directed acyclic graph of transformations. Via hand-eye calibration you can find the transformation from the hand of the robot to the optical centre of the camera. However, the Kinect driver already publishes the transform from the base of the Kinect to the RGB camera (if I remember correctly). So you have to tell aruco to give you poses not in the coordinate system of the RGB camera, but in that of the base of the Kinect.

So you should set the reference_frame launch argument of aruco to the tf frame id of the "root" frame published by the Kinect driver. Then you have to use that same frame as the tracking base frame for easy_handeye.

Right now you are asking aruco to give you the coordinates of the marker with respect to itself (the marker -> marker transform), so tf cannot even visualise it.
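
For example, assuming the iai_kinect2 default topic and frame names (double-check them with rqt_tf_tree):

    <node name="aruco_tracker" pkg="aruco_ros" type="single">
        <remap from="/camera_info" to="/kinect2/hd/camera_info" />
        <remap from="/image" to="/kinect2/hd/image_color_rect" />
        <param name="marker_size" value="$(arg marker_size)"/>
        <param name="marker_id" value="$(arg marker_id)"/>
        <!-- root of the kinect's tf tree, not the RGB optical frame -->
        <param name="reference_frame" value="kinect2_link"/>
        <param name="camera_frame" value="kinect2_rgb_optical_frame"/>
        <param name="marker_frame" value="camera_marker"/>
    </node>

and then kinect2_link would also be the tracking_base_frame for easy_handeye.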

I have not done this in more than a year and I am writing this off the top of my head; I hope this is correct.

marcoesposito1988 commented 5 years ago

Closing for inactivity. If the problem persists, please comment and the issue will be reopened if appropriate.