goyalvarun02 opened this issue 2 years ago
Hi @goyalvarun02, the coordinate transformation (TF) in ROS and MoveIt uses quaternions for orientation representation. There is a function available in the tf module to convert RPY to a quaternion, refer here. Another way to calculate it is the axis-angle method: for example, the TCP orientation you referred to actually has an RPY of [180, 0, 180] degrees, which corresponds to axis = [0, 1, 0] and angle = 180 degrees (please search for this calculation online or use our axis-angle interface). The quaternion is then calculated as w = cos(angle/2) = 0 and [x, y, z] = sin(angle/2) * axis = 1 * [0, 1, 0] = [0, 1, 0].
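The axis-angle calculation above can be sketched in plain Python (this is an illustrative snippet, not code from xarm_ros; the helper name `axis_angle_to_quat` is hypothetical):

```python
import math

def axis_angle_to_quat(axis, angle_deg):
    """Convert an axis-angle rotation to a quaternion (w, x, y, z)."""
    half = math.radians(angle_deg) / 2.0
    s = math.sin(half)
    return (math.cos(half), axis[0] * s, axis[1] * s, axis[2] * s)

# RPY = [180, 0, 180] deg is equivalent to axis = [0, 1, 0], angle = 180 deg:
# w = cos(90 deg) = 0, [x, y, z] = sin(90 deg) * [0, 1, 0] = [0, 1, 0]
w, x, y, z = axis_angle_to_quat([0, 1, 0], 180)
print([round(v, 6) for v in (w, x, y, z)])
```

This reproduces the hand calculation: the result is the quaternion [w=0, x=0, y=1, z=0] used in the demo.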
Hi @penglongxiang
Thanks for the reply. I understood that there is a function which can convert RPY to a quaternion. For that, the RPY must be known, but in the code we do not know the RPY at runtime.
My query is about the code below: how did you conclude that the quaternion for the position 40 mm above the object is the following (where the tf is calculated at runtime): next_target.orientation.x = 0; next_target.orientation.y = 1; next_target.orientation.z = 0; next_target.orientation.w = 0;
Thanks
Hi @goyalvarun02,
This orientation is a fixed one in our demo for simplicity, because we tested grasping on a 2D platform where the objects are placed on a table parallel to the robot base. In a real application you can make this orientation dynamic, consistent with the current TCP orientation, or otherwise subject to the target object surface.
Hi @penglongxiang ,
I tried to use the same orientation of the object returned by the transform. I observed the gripper going to the right place, but along the path one of the links (link5) collided with the table.
Any suggestion on how to correct this?
I am afraid you have to make sure the orientation returned by object detection, and its transformation to the robot base, is correct. Try some easy orientations and do hand calculations to verify them and find out which part gives the incorrect result.
Hi @penglongxiang
Do you have any sample or pointer where I can see such a hand calculation? In my case, find_object returns a pose (x, y, z) and a quaternion (w, x, y, z), so I could use a hand calculation to validate these, but how?
I can measure the x, y, z position with a measuring tape, but how can I validate the orientation values? :-(
Hi @penglongxiang
As per my understanding and the user manual:
"The default TCP coordinate system is defined at the centre point of the end flange of the robotic arm, and it is the result of rotating [180°, 0°, 0°] around the X/Y/Z-axis of the base coordinate system in order." So why do you assume an RPY of [180, 0, 180] degrees in your case?
In my case, I have placed the object and the base of the robot on the same table (the object is parallel to the robot). But when I passed the values obtained from lookupTransform (position and orientation) as-is to the /xarm_straight_plan service, the xArm7 collided with the table. Please advise.
You said the orientation is fixed in your demo for simplicity, because you tested grasping on a 2D platform where the objects are placed on a table parallel to the robot base. How did you fix the orientation? When the object is parallel to the base, the TCP orientation should stay the same, yet you are using [180, 0, 180] while I used [180, 0, 0] (as mentioned in the user manual).
Please suggest; it is a long-pending issue which I am trying to fix.
PFA pic of my setup .
Hi, as said in my previous answer, the fixed orientation is just an example. Any yaw value with roll = 180 and pitch = 0 will make the gripper perpendicular to the base plane. Please make sure the calibration result is reasonable and the object transformation from camera to robot coordinates is correct.
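The claim that any yaw with roll = 180 and pitch = 0 keeps the gripper perpendicular to the base plane can be verified numerically (an illustrative sketch, not xarm_ros code; it assumes the fixed-axis X-Y-Z RPY convention used by the manual):

```python
import math

def rpy_to_matrix(roll, pitch, yaw):
    """Rotation matrix for fixed-axis RPY: R = Rz(yaw) @ Ry(pitch) @ Rx(roll)."""
    cr, sr = math.cos(roll), math.sin(roll)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cy, sy = math.cos(yaw), math.sin(yaw)
    return [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp,     cp * sr,                cp * cr],
    ]

# With roll = 180 deg and pitch = 0, the TCP z-axis (third column of R)
# is [0, 0, -1] for every yaw, i.e. the gripper points straight down.
for yaw_deg in (0, 45, 90, 180):
    R = rpy_to_matrix(math.pi, 0.0, math.radians(yaw_deg))
    z_axis = [R[0][2], R[1][2], R[2][2]]
    print(yaw_deg, [round(v, 6) for v in z_axis])
```

The yaw only spins the gripper about the vertical axis; it never tilts the tool away from the table normal.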
Would you at least give us some data, like the position command given to the /xarm_straight_plan service and the calibration result for this configuration?
Hi @penglongxiang
Here is the calibration result (eye-on-base):

```yaml
parameters:
  eye_on_hand: false
  freehand_robot_movement: false
  move_group: xarm7
  move_group_namespace: /
  namespace: /xarm_realsense_handeyecalibration_eye_on_base/
  robot_base_frame: link_base
  robot_effector_frame: link_eef
  tracking_base_frame: camera_color_optical_frame
  tracking_marker_frame: camera_marker
transformation:
  qw: 0.511854442092111
  qx: -0.49520404139972257
  qy: -0.5053937568269141
  qz: 0.4871910693478825
  x: 1.223116565148332
  y: -0.2299467666922303
  z: 0.16826298401788548
```
Here are the snapshots :
Going to start position
Going to pick up the object using the pose and orientation passed by find_object
Error on the terminal (where roscore is running):
Error on terminal where application is running:
Please let me know if anything else is required from my side.
It seems that you are giving the object orientation (same as the robot base, with z pointing upwards) directly as the TCP orientation (whose z should point downwards). Please try giving [qw=0, qx=1, qy=0, qz=0] as the target orientation; this corresponds to RPY = [180, 0, 0].
Hi @penglongxiang ,
When I give [qw=0, qx=1, qy=0, qz=0] as the target orientation, the gripper succeeds in picking up the object, since the 2D plane of the object (only one surface is visible to the camera) is identified by find_object_2d and it returns the (x, y, z) coordinates of that plane.
However, when I rotate the object by some angle (10 or 20 degrees), two adjacent surfaces become visible to the camera and it gives the pose and orientation of that plane (which is quite different from the previous case). In this case I am also using [qw=0, qx=1, qy=0, qz=0] as the target orientation, but when the gripper comes to pick up the object, one of its fingers collides with the corner of the object, or sometimes the middle of the object.
Refer to this answer: https://github.com/xArm-Developer/xarm_ros/issues/121#issuecomment-1236520849
You may still need to calculate the correct orientation with a rotation matrix. First convert the object's quaternion orientation to the corresponding rotation matrix R, which consists of three unit column vectors representing its X, Y, and Z axes, that is R = [x, y, z]. Since your purpose is to flip the Z axis (so the TCP z points downwards while the object's z points upwards), you can just use R' = [x, -y, -z] as the TCP orientation (negating two columns keeps it a proper rotation) and convert it back to a quaternion representation.
This is so far the best way I can think of; I am not sure if there is a simpler one, since I am not an expert on quaternions.
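The column-flip idea above can be sketched in plain Python (an illustrative snippet, not from xarm_ros; the helper names are hypothetical, and in a ROS node one would normally use tf.transformations or scipy instead):

```python
import math

def quat_to_matrix(w, x, y, z):
    """Unit quaternion (w, x, y, z) -> 3x3 rotation matrix (columns = axes)."""
    return [
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ]

def matrix_to_quat(R):
    """3x3 rotation matrix -> quaternion (w, x, y, z), branch-robust."""
    tr = R[0][0] + R[1][1] + R[2][2]
    if tr > 0:
        s = math.sqrt(tr + 1.0) * 2
        return (0.25 * s, (R[2][1] - R[1][2]) / s,
                (R[0][2] - R[2][0]) / s, (R[1][0] - R[0][1]) / s)
    if R[0][0] >= R[1][1] and R[0][0] >= R[2][2]:
        s = math.sqrt(1.0 + R[0][0] - R[1][1] - R[2][2]) * 2
        return ((R[2][1] - R[1][2]) / s, 0.25 * s,
                (R[0][1] + R[1][0]) / s, (R[0][2] + R[2][0]) / s)
    if R[1][1] >= R[2][2]:
        s = math.sqrt(1.0 + R[1][1] - R[0][0] - R[2][2]) * 2
        return ((R[0][2] - R[2][0]) / s, (R[0][1] + R[1][0]) / s,
                0.25 * s, (R[1][2] + R[2][1]) / s)
    s = math.sqrt(1.0 + R[2][2] - R[0][0] - R[1][1]) * 2
    return ((R[1][0] - R[0][1]) / s, (R[0][2] + R[2][0]) / s,
            (R[1][2] + R[2][1]) / s, 0.25 * s)

def flip_for_tcp(w, x, y, z):
    """Keep the object's x-axis, negate its y- and z-axis columns,
    so the TCP z-axis points down while R stays a proper rotation."""
    R = quat_to_matrix(w, x, y, z)
    Rf = [[R[i][0], -R[i][1], -R[i][2]] for i in range(3)]
    return matrix_to_quat(Rf)

# An object aligned with the base frame (identity quaternion) maps to the
# downward-facing TCP orientation [w=0, x=1, y=0, z=0] from earlier in the thread:
print(flip_for_tcp(1.0, 0.0, 0.0, 0.0))
```

Note that negating exactly two columns leaves the determinant at +1; negating only the z column would produce a reflection, not a reachable TCP orientation.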
Hi @penglongxiang
I will try your suggestion as well.
I was wondering whether we could add a constraint in MoveIt so that the arm does not go below the table, since it tries to do so whenever I pass the orientation from find_object_2d. Or what exactly is the root cause of this issue? I could not understand it :-(
The root cause is simply the incorrect TCP orientation. The object and gripper orientations are not necessarily the same; their coordinate axes can point in opposite directions, and one more transformation is enough to account for that. Please consult the developers of find_object_2d for the definition of the detected object's orientation.
Hi @penglongxiang, how do I fix the object orientation, and with respect to which reference does it need to be fixed? I know the gripper coordinates (the TCP orientation with reference to the base coordinate system).
After getting coordinates from the transform, when we want to move the gripper 40 mm above, I understand it will not have the same orientation as the transform. Then how did you calculate the orientation x=0, y=1, z=0, w=0?