icub-tech-iit / xcub-moveit2

Collect the outcomes of our study on the use of MoveIT with our robots
BSD 3-Clause "New" or "Revised" License

Design a MoveIt2 demo for PI16 review and prepare for it #13

Closed martinaxgloria closed 10 months ago

martinaxgloria commented 11 months ago

Task description

It would be nice to have a MoveIt2 demo for the next PI16 review with simulated and/or real robot. In particular, we should understand which robot is available and what to show.

Definition of done

The MoveIt controller works on ergoCub in simulation, so that it can then be moved to the real robot.

pattacini commented 11 months ago

To be discussed w/ @maggia80.

maggia80 commented 11 months ago

@martinaxgloria good point. We should ask Davide De Tommaso and/or Francesco Rea, who are responsible for the iCubs @ Erzelli, whether any robot is available. I would avoid using the old robots that still have the PC104.

Nicogene commented 11 months ago

In case we use one of Erzelli's iCubs, we could schedule a visit there to install what we need on that setup, after agreeing w/ Rea/De Tommaso.

martinaxgloria commented 11 months ago

About the use of the iCub @ Erzelli: icub-head runs Ubuntu 20.04, but ROS 2 Humble requires 22.04. For this reason, we could consider moving the demo to ergoCub. I will look into it starting from the next sprint.

cc @maggia80 @Nicogene @pattacini

martinaxgloria commented 11 months ago

As proposed in the previous comment, today I worked on generalizing the ros2_control framework so that it can be used with both iCub and ergoCub. In particular, I:

ergocub.webm

Probably there is a problem in the ergoCub urdf, since it's clearly not stable, but on the MoveIt/ROS side everything seems to work.

cc @pattacini @Nicogene

pattacini commented 11 months ago

Super!

The arm doesn't oscillate; only the body itself is wobbling. We may consider anchoring the waist with a link fixed to the ground for our experiments.

Nicogene commented 11 months ago

> Super!
>
> The arm doesn't oscillate; only the body itself is wobbling. We may consider anchoring the waist with a link fixed to the ground for our experiments.

You can use the fixed or feet_fixed models if you don't want to make it walk. We noticed that it was not oscillating when it had the torso_pitch locked.

martinaxgloria commented 11 months ago

Today, @Nicogene and I investigated the wobbling problem of the ergoCubGazeboV1 model and we found out that it comes from https://github.com/icub-tech-iit/ergocub-software/commit/aae9cd689de40f900512fc1ad6a9c0c9b110d02b (I opened an issue for this problem, but it's outside the scope of this one).

Moreover, regarding the MoveIt2 demo, we could in any case switch to the feet_fixed or fixed model, as proposed by @Nicogene.

martinaxgloria commented 11 months ago

After this fix https://github.com/icub-tech-iit/ergocub-software/pull/176, the contacts on the soles are more stable and the robot doesn't wobble anymore. For the time being, I'd stay with the standard ergoCubGazeboV1 model; if we want to move to the other configuration, I have to understand how to handle the .sdf file within the ros2 environment.

pattacini commented 11 months ago

> For the time being, I'd stay with the standard ergoCubGazeboV1

Ok!

martinaxgloria commented 11 months ago

Yesterday I started playing with the MoveIt Task Constructor to understand whether it could be a useful tool for our demo.

In particular, it uses the poses of a group of joints defined inside the .srdf file and breaks a complex task down into a set of steps. So, one con is that the poses to be reached have to be defined in advance, but if we are not planning to do really complex movements, that wouldn't be a problem.
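For reference, a minimal sketch (not the demo code) of how a pose predefined in the .srdf can be reached through the MoveGroupInterface; the node, group, and group-state names below are placeholders, not the ones used in this repo:

```cpp
#include <rclcpp/rclcpp.hpp>
#include <moveit/move_group_interface/move_group_interface.h>

// Minimal sketch: reach a joint-space pose predefined as a <group_state> in the .srdf.
// "right_arm" and "pregrasp" are hypothetical names.
int main(int argc, char** argv)
{
  rclcpp::init(argc, argv);
  auto node = rclcpp::Node::make_shared("srdf_pose_demo");

  moveit::planning_interface::MoveGroupInterface move_group(node, "right_arm");
  move_group.setNamedTarget("pregrasp");  // looks up the pose defined in the SRDF
  move_group.move();                      // plan and execute

  rclcpp::shutdown();
  return 0;
}
```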

cc @Nicogene @pattacini

What is your opinion about it?

pattacini commented 11 months ago

We're not going to use the hands, just the torso and the arm. With the latter kinematic chain, it'd be nice to showcase some choreography:

If the tool above can be helpful for this, let's go with it, otherwise it's mere overkill.

During the demo, I'd expect the following questions to be raised (appetite comes with eating):

Let's just gear up for them by preparing sensible answers in advance.

martinaxgloria commented 11 months ago

Thanks for the tips @pattacini!

Ok, so I'm going to think about that and understand which could be the best solution.

> Let's just gear up for them by preparing sensible answers in advance.

Sure, probably the hands question would be the most popular one

pattacini commented 11 months ago

> Moving the hand both in position and orientation mimicking a reaching task for grasping an object (although we won't use the hand).

You may find the grasping sandbox relevant to this. You could record the reaching trajectory of the end-effector's pose in the Cartesian space and replicate it[^1] with MoveIt2.

[^1]: After a suitable rescaling between iCub and ergoCub.

martinaxgloria commented 11 months ago

A few updates on this activity.

In the end, I gave up on the MoveIt Task Constructor since it was not essential in our case, and I tried to plan a circular trajectory by defining a set of points and computing the Cartesian path that follows them. Here is the result:

circular_path.webm
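For context, here is a minimal sketch of how such a circular Cartesian path can be requested through the MoveGroupInterface; the circle radius, step size, and sample count are illustrative assumptions, not the values actually used in the demo:

```cpp
#include <cmath>
#include <vector>
#include <geometry_msgs/msg/pose.hpp>
#include <moveit/move_group_interface/move_group_interface.h>
#include <moveit_msgs/msg/robot_trajectory.hpp>

// Sketch: sample waypoints on a circle around the current end-effector pose and
// ask MoveIt for a Cartesian path through them. Values are placeholders.
void planCircle(moveit::planning_interface::MoveGroupInterface& move_group)
{
  const geometry_msgs::msg::Pose start = move_group.getCurrentPose().pose;
  const double radius = 0.05;  // 5 cm circle in the y-z plane of the planning frame
  std::vector<geometry_msgs::msg::Pose> waypoints;
  for (int i = 0; i <= 36; ++i)
  {
    const double th = 2.0 * M_PI * i / 36.0;
    geometry_msgs::msg::Pose wp = start;
    wp.position.y += radius * std::cos(th) - radius;
    wp.position.z += radius * std::sin(th);
    waypoints.push_back(wp);
  }

  moveit_msgs::msg::RobotTrajectory trajectory;
  // eef_step = 1 cm, jump_threshold = 0 (disabled); returns the fraction of the path achieved
  const double fraction = move_group.computeCartesianPath(waypoints, 0.01, 0.0, trajectory);
  if (fraction > 0.99)
    move_group.execute(trajectory);
}
```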

cc @pattacini @Nicogene

pattacini commented 11 months ago

Super nice! I guess you got inspired by our tutorial 😄

In this case, the process of designing the trajectory in MoveIt2 is also worth showcasing, I'd say.

Perhaps we can make the circle wider, and we can also control the orientation of the hand so that the palm always points down.

martinaxgloria commented 11 months ago

Hi @pattacini,

I tried a wider circular path with the palm down, as you suggested in the previous comment, and the result was the following:

incomplete_path.webm

As you can see, the solver wasn't able to compute the complete trajectory; printing the percentage of the trajectory achieved, I obtained:

[robot_moveit-2] [INFO] [1695909393.811120763] [move_group_demo]: Cartesian path: 46.94% achieved

After the f2f chat we had, I checked the TRAC-IK parameters, trying to tune them to obtain the result we aim for, but the only parameters exposed are kinematics_solver_timeout and position_only_ik. Changing the timeout value, nothing changes, while with the latter set to true (the default is false) we lose control over the orientation of the hand, so it's not suitable for our case.
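For reference, these parameters live in the kinematics.yaml loaded by move_group; a sketch with a placeholder group name and illustrative values:

```yaml
# kinematics.yaml (sketch; group name and values are placeholders)
r_arm:
  kinematics_solver: trac_ik_kinematics_plugin/TRAC_IKKinematicsPlugin
  kinematics_solver_timeout: 0.05   # seconds allotted to the IK solver
  position_only_ik: false           # true drops the orientation constraint
```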

Moreover, I realized that in the .srdf file there's a check of the collisions between adjacent links, and during the trajectory-planning stage I had set the avoid-collisions parameter to true. Just to test, I disabled those collision checks, forcing the kinematics to be solved, and the result was:

forced_complete_path.webm

In the video, it's possible to see that the wrist becomes red, which means that it comes into collision with some other part (from the visualization I'm not able to see where or which links are in contact). This is why the solver didn't achieve 100% of the execution.
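For completeness, the flag I toggled corresponds to the avoid_collisions argument of computeCartesianPath; a short sketch under the same assumptions as the earlier snippet:

```cpp
// Sketch: same Cartesian request as before, but with collision checking disabled.
double planWithoutCollisionChecking(moveit::planning_interface::MoveGroupInterface& move_group,
                                    const std::vector<geometry_msgs::msg::Pose>& waypoints,
                                    moveit_msgs::msg::RobotTrajectory& trajectory)
{
  return move_group.computeCartesianPath(waypoints, /*eef_step=*/0.01, /*jump_threshold=*/0.0,
                                         trajectory, /*avoid_collisions=*/false);
}
```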

pattacini commented 11 months ago

Great debugging @martinaxgloria 🚀

> In the video, it's possible to see that the wrist becomes red, which means that it comes into collision with some other part (from the visualization I'm not able to see where or which links are in contact). This is why the solver didn't achieve 100% of the execution.

Can you check that the collisions among parts take place even if the joints remain within their bounds? If this is the case, then it's not a real problem for us and we can keep the collision flag disabled.

martinaxgloria commented 11 months ago

> Can you check that the collisions among parts take place even if the joints remain within their bounds? If this is the case, then it's not a real problem for us and we can keep the collision flag disabled.

I don't know if I got the question, but I checked the joints' positions from the yarpmotorgui during the entire movement and they remained within the bounds.

image

@pattacini

pattacini commented 11 months ago

Correct 👍🏻 However, I'd rather favor a direct check via data acquisition to be 100% sure.

martinaxgloria commented 11 months ago

Today I worked on this activity. In particular, I scaled the trajectory already implemented so that it works with iCub, whose root_link reference frame has the x and y axes oriented opposite to ergoCub's, see below:

ergocub_root_link

icub_root_link
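In practice this amounts to mirroring the x and y coordinates of the waypoints and rotating the reference orientation by 180° about z; a rough sketch, assuming the waypoints are geometry_msgs poses expressed in root_link (function and variable names are illustrative):

```cpp
#include <vector>
#include <geometry_msgs/msg/pose.hpp>
#include <tf2/LinearMath/Quaternion.h>
#include <tf2_geometry_msgs/tf2_geometry_msgs.hpp>

// Sketch: remap ergoCub waypoints into iCub's root_link frame, whose x and y axes
// point the opposite way (equivalent to a 180-degree rotation about z).
void ergocubToIcubRootLink(std::vector<geometry_msgs::msg::Pose>& waypoints)
{
  for (auto& wp : waypoints)
  {
    wp.position.x = -wp.position.x;
    wp.position.y = -wp.position.y;

    tf2::Quaternion q_orig, q_flip(0.0, 0.0, 1.0, 0.0);  // 180 degrees about z
    tf2::fromMsg(wp.orientation, q_orig);
    wp.orientation = tf2::toMsg(q_flip * q_orig);
  }
}
```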

After that, I tried to include a reaching-like trajectory and in the end I obtained this:

demo_first_attempt.webm

Obviously, this is not the final result, but an attempt to resize the movements (especially for the final part); I still have to log the data to check whether the joints remain within their bounds during the execution.

Moreover, this afternoon @Nicogene and I tried to install ROS 2 Humble on the iCubGenova11 head. First, we tried to install ros-humble-desktop within a conda environment alongside yarp, but we ran into problems since the latest version of yarp-devices-ros2 depends on yarp v3.8.1, and trying to install that version we got those errors. We then thought that a possible solution could be to use the ROS Foxy already installed on icub-head via apt: the only purpose is to launch the yarprobotinterface that exposes the ros2 nws devices and publishes on ros2 topics, and then to read and attach to those topics from my laptop connected to the same network. This should be possible by using a Cyclone DDS configuration file that restricts discovery to a closed set of IPs that can see the topics. The only problem is that both machines should run the same ROS distro, which is not our case. So this activity is still a WIP.

cc @pattacini @Nicogene

martinaxgloria commented 10 months ago

Today, @Nicogene and I tried to find a solution to have ros2 installed on icub-head in order to run the demo. First, I installed yarp v3.7.2 within the conda environment and then tried to compile yarp-devices-ros2 by manually changing those lines. The workaround worked fine, but I wasn't able to launch the yarprobotinterface (both with and without the ros2 nws exposed) since some devices were compiled from source and others inside the conda env. For example:

[ERROR] yarprobotinterface intercepted a segmentation fault caused by a faulty plugin:
[ERROR] /usr/local/src/robot/robotology-superbuild/build/install/lib/iCub/embObjMotionControl.so(_ZThn688_N4yarp3dev19embObjMotionControl7getAxesEPi+0xb) [0x7f391c07b59b]
Trace requested at /home/conda/feedstock_root/build_artifacts/yarp_1674563103456/work/src/yarprobotinterface/Module.cpp:77 by code called from:
/home/icub/mambaforge/envs/ros_env/bin/../lib/libYARP_os.so.3(_Z16yarp_print_traceP8_IO_FILEPKcj+0x32) [0x7f39225dcb42]
yarprobotinterface(+0xa13a) [0x56113e35413a]
/lib/x86_64-linux-gnu/libc.so.6(+0x46210) [0x7f392216d210]
/usr/local/src/robot/robotology-superbuild/build/install/lib/iCub/embObjMotionControl.so(_ZThn688_N4yarp3dev19embObjMotionControl7getAxesEPi+0xb) [0x7f391c07b59b]
/usr/local/src/robot/robotology-superbuild/build/install/lib/yarp/yarp_controlboardremapper.so(+0x42de4) [0x7f391c240de4]
/usr/local/src/robot/robotology-superbuild/build/install/lib/yarp/yarp_controlboardremapper.so(+0x45108) [0x7f391c243108]
/usr/local/src/robot/robotology-superbuild/build/install/lib/yarp/yarp_controlboardremapper.so(+0x452c8) [0x7f391c2432c8]
/home/icub/mambaforge/envs/ros_env/bin/../lib/libYARP_robotinterface.so.3(_ZNK4yarp14robotinterface6Device6attachERKNS_3dev14PolyDriverListE+0x91) [0x7f3922746d11]
/home/icub/mambaforge/envs/ros_env/bin/../lib/libYARP_robotinterface.so.3(_ZN4yarp14robotinterface5Robot7Private6attachERKNS0_6DeviceERKSt6vectorINS0_5ParamESaIS7_EE+0x4e2) [0x7f3922754312]
/home/icub/mambaforge/envs/ros_env/bin/../lib/libYARP_robotinterface.so.3(_ZN4yarp14robotinterface5Robot10enterPhaseENS0_11ActionPhaseE+0x1a25) [0x7f39227580e5]
yarprobotinterface(+0xc1f1) [0x56113e3561f1]
/home/icub/mambaforge/envs/ros_env/bin/../lib/libYARP_os.so.3(_ZN4yarp2os8RFModule9runModuleERNS0_14ResourceFinderE+0xef) [0x7f392263950f]
yarprobotinterface(+0x90b9) [0x56113e3530b9]
/lib/x86_64-linux-gnu/libc.so.6(__libc_start_main+0xf3) [0x7f392214e0b3]
yarprobotinterface(+0x9171) [0x56113e353171]
Illegal instruction (core dumped)

This was the error I obtained. After this attempt, I had an f2f chat with @Nicogene and we decided that, since the iCubGenova11 setup is shared with the RL, it would be better not to change the environment variables and/or force the environment paths to link to ros2 resources.

For this purpose, we decided to install ros-foxy and compile the repo with the moveit demo on the iCubGenova11 laptop, and to use the head (on which this distro was already installed via apt) to launch the yarprobotinterface and expose the ros2 topics. I started this activity, but I had to change some methods and attributes in my code since they differ between the ros-foxy and ros-humble distros. Tomorrow I'll test the changes with this new configuration and I'll let you know.

cc @pattacini

martinaxgloria commented 10 months ago

Some updates: yesterday, with the help of @Nicogene, I tried to solve some problems related to this activity. First of all, we wrote a custom "all joints remapper" that excludes the joints that are not present at the urdf level (i.e. eyes and fingers): this was needed because all the joints were published on the /joint_states topic in terms of position, velocity, and effort, but when the robot_state_publisher node tried to publish on the /tf topic the poses of the joints it read from the urdf model, it didn't find a perfect match and raised an error.

After solving this issue, we successfully visualized the model in rviz2, with the poses properly published on the /tf topic by the robot_state_publisher node. But other problems with other nodes (in particular with the ros2_control_node) arose. We tried to patch them, but in the end, after asking the HSP people, we decided to upgrade the icub-head OS to Ubuntu 22.04 and install ros_humble, to have compatibility with what I have done so far in simulation.

cc @pattacini

martinaxgloria commented 10 months ago

After the upgrade of the icub-head OS to Ubuntu 22.04 and the installation of ros-humble on that machine, @Nicogene and I succeeded in making my laptop and icub-head communicate with each other by passing Cyclone DDS a configuration file in which the IP addresses of the two machines are specified as a closed network.
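For reference, a sketch of the kind of Cyclone DDS profile involved, pointed to via the CYCLONEDDS_URI environment variable on both machines; the addresses below are placeholders, not the actual ones:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Sketch: restrict discovery to the two machines (placeholder addresses). -->
<CycloneDDS xmlns="https://cdds.io/config">
  <Domain Id="any">
    <General>
      <AllowMulticast>false</AllowMulticast>
    </General>
    <Discovery>
      <ParticipantIndex>auto</ParticipantIndex>
      <Peers>
        <Peer address="192.168.100.10"/> <!-- icub-head (placeholder) -->
        <Peer address="192.168.100.20"/> <!-- laptop (placeholder) -->
      </Peers>
    </Discovery>
  </Domain>
</CycloneDDS>
```

Each machine then exports something like CYCLONEDDS_URI=file:///path/to/cyclonedds.xml before launching the ROS 2 nodes.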

Today I tried to use ros2_control on iCubGenova11 and this was the result both in simulation and on the real robot:

demo_attempt.webm

https://github.com/icub-tech-iit/study-moveit/assets/114698424/54c16b6e-39c3-4e19-92c5-73bf4836493d

Now, I have to refine the trajectories and improve a few things (e.g. the possibility of coming back to the starting position after all the movements, in order to restart the demo without homing all the joints from the yarpmotorgui).

cc @Nicogene @pattacini

pattacini commented 10 months ago

Voilà! Superb! 🚀

maggia80 commented 10 months ago

Amazing!!!

martinaxgloria commented 10 months ago

Hi @pattacini, as you suggested, I tried to implement the reaching task. I started with icub-gazebo-sandbox, from which I logged some poses during the reaching trajectory (side grasping of the mustard bottle with the right hand), and I gave them as waypoints to the kinematics solver to compute the Cartesian trajectory with the robot's right hand as end-effector. This is what I obtained:

https://github.com/icub-tech-iit/study-moveit/assets/114698424/bdc25e5c-1fc8-4ab4-8160-f5922d0c24d7

Maybe it's just me, but it seems different from the one seen in the sandbox. Do I have to rescale it? Or maybe do some more acquisitions and try to replicate them within the MoveIt environment?

cc @Nicogene

pattacini commented 10 months ago

Update F2F. @martinaxgloria could you quickly summarize the action points?

martinaxgloria commented 10 months ago

After an f2f chat with @pattacini, we came to the conclusion that:

martinaxgloria commented 10 months ago

With the help of @Nicogene, I tried to set the correct end-effector orientation for the grasping task. Starting from the icub-grasping-sandbox, we logged the grasping pose the hand has to reach and we highlighted its frame in gazebo:

Screenshot from 2023-10-17 16-23-35

Then, since it is expressed in the axis-angle convention and MoveIt works with quaternions, we used the tf2::Quaternion() axis-angle constructor to obtain the corresponding quaternion. We tried to move the eef to that pose, but it wasn't able to reach it. We visualized in rviz the pose we set and we obtained the circled tf:

msg566877121-767969

If you compare this last frame with the reference one in gazebo, you can see that they are very different. It seems that we missed a transformation between the yarp and ros2 conventions.
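For the record, the conversion we are performing is roughly the following sketch (the numeric axis-angle values are placeholders, not the logged grasping pose):

```cpp
#include <cmath>
#include <geometry_msgs/msg/pose.hpp>
#include <tf2/LinearMath/Quaternion.h>
#include <tf2/LinearMath/Vector3.h>
#include <tf2_geometry_msgs/tf2_geometry_msgs.hpp>

// Sketch: convert a YARP-style axis-angle rotation (unit axis + angle in radians)
// into the quaternion expected by MoveIt. Values are placeholders.
geometry_msgs::msg::Pose makeTarget()
{
  tf2::Vector3 axis(0.0, -1.0, 0.0);
  const double angle = M_PI / 2.0;               // radians
  tf2::Quaternion q(axis.normalized(), angle);   // axis-angle constructor

  geometry_msgs::msg::Pose target;
  target.orientation = tf2::toMsg(q);
  return target;
}
```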

cc @pattacini @traversaro

So far, we have two movements that we can show (hand down and circle trajectory), and we can show how it's possible to set poses and compute the cartesian trajectory also from the GUI.

Nicogene commented 10 months ago

We know that the pose retrieved from our pipeline is expressed with respect to the root_link; we have to check whether the pose we are giving to MoveIt is expressed with respect to the world frame. In that case, premultiplying by root_link_H_world should be sufficient.
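In code, something along these lines (a sketch with Eigen; how the world transform of root_link is obtained is left out, and the variable names are illustrative):

```cpp
#include <Eigen/Geometry>

// Sketch: if the logged pose is expressed in the world frame, bring it into root_link
// by premultiplying with root_link_H_world.
Eigen::Isometry3d worldToRoot(const Eigen::Isometry3d& world_H_root,   // pose of root_link in world
                              const Eigen::Isometry3d& world_H_hand)  // logged pose in world
{
  const Eigen::Isometry3d root_H_world = world_H_root.inverse();
  return root_H_world * world_H_hand;  // root_H_hand: the pose to give to MoveIt
}
```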

cc @traversaro

pattacini commented 10 months ago

Nonetheless, the reference orientations we used to command the circular path seemed to be ok. Am I missing something in this respect?

martinaxgloria commented 10 months ago

> Nonetheless, the reference orientations we used to command the circular path seemed to be ok.
>
> Am I missing something in this respect?

It's ok because I manually computed the transformation between the hand and the root link in RPY angles and then transformed it into a quaternion to get the final pose (which is the hand oriented downward).

pattacini commented 10 months ago

BTW, the reference frame attached to the wrist visible in the figure above is not the one used for reaching. The correct one is documented.

pattacini commented 10 months ago

https://icub-tech-iit.github.io/documentation/icub_kinematics/icub-forward-kinematics/icub-forward-kinematics-arms/

martinaxgloria commented 10 months ago

Importing the iCubGazeboV2_5 model (both with and without hands) on gazebo, I noticed that the hand reference frame located on the palm is not visible from the urdf:

Screenshot from 2023-10-18 09-01-16

cc @Nicogene

traversaro commented 10 months ago

Sorry, I am not sure I got the problem from the issue, can we have a quick chat on it?

martinaxgloria commented 10 months ago

image

So far, I worked with the link frame no. 9 in the figure, but the reference one for the hand is no. 10, which has a different orientation with respect to the former. This is why I obtained, and tried to reach, an unreachable pose.

pattacini commented 10 months ago

The most important point is that we have to find a way to express within the URDF the standard End-Effector used in reaching as defined in https://icub-tech-iit.github.io/documentation/icub_kinematics/icub-forward-kinematics/icub-forward-kinematics-arms/.

As of now, it seems that the last frame available from URDF is the one in https://github.com/icub-tech-iit/study-moveit/issues/13#issuecomment-1767820259, which is attached to the wrist though and not to the palm.

traversaro commented 10 months ago

> The most important point is that we have to find a way to express within the URDF the standard End-Effector used in reaching as defined in https://icub-tech-iit.github.io/documentation/icub_kinematics/icub-forward-kinematics/icub-forward-kinematics-arms/.

Those frames are l_hand_dh_frame and r_hand_dh_frame, see https://icub-tech-iit.github.io/documentation/icub_kinematics/icub-model-naming-conventions/icub-model-naming-conventions/#frames .

traversaro commented 10 months ago

> Importing the iCubGazeboV2_5 model (both with and without hands) on gazebo, I noticed that the hand reference frame located on the palm is not visible from the urdf:

Gazebo lumps the fixed joints of the model, so all the "additional frames" present in the URDF model are ignored and only the frames of actual links with mass are displayed; hence it is not a good way of visualizing all the frames of the model. To visualize all the frames of the model, you can either use RViz or (harder solution) write a custom visualizer combining iDynTree's Visualizer API and KinDynComputations, see for example (for the Visualizer API) https://github.com/ami-iit/yarp-device-openxrheadset/blob/c81176c3a5b535876f611ab490e8bbb09f0ffe64/src/utils/OpenXrFrameViz/main.cpp#L165 .

martinaxgloria commented 10 months ago

Thank you @traversaro. I noticed that the l_hand_dh_frame and r_hand_dh_frame were not included in the configuration files generated with the MoveIt Setup Assistant, which is why I wasn't able to see and use them from the MoveIt side. I'm going to add this frame and redo the tests.
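Once the frame is part of the planning group, selecting it on the MoveIt side should just require something like the following sketch (the group name is a placeholder):

```cpp
#include <rclcpp/rclcpp.hpp>
#include <moveit/move_group_interface/move_group_interface.h>

// Sketch: select the palm DH frame as the MoveIt end effector
// ("right_arm_torso" is a hypothetical group name).
void configureEndEffector(const rclcpp::Node::SharedPtr& node)
{
  moveit::planning_interface::MoveGroupInterface move_group(node, "right_arm_torso");
  move_group.setEndEffectorLink("r_hand_dh_frame");
  move_group.setPoseReferenceFrame("root_link");
}
```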

martinaxgloria commented 10 months ago

After fixing the end effector reference frame from r_hand to r_hand_dh_frame, I tried again to run the simulation and this is what I obtained:

Screenshot from 2023-10-18 11-34-10

The bold rf is the hand's one, while the other is the reference pose we want to reach. As you can see, the two are now oriented in the same way (so the hand reaches the desired orientation), but this is still wrong with respect to the one seen in the sandbox (from which I logged it).

With @Nicogene, we made a comparison between the r_hand_dh_frame of iCubGazeboV2_5_visuomanip used in the sandbox:

Screenshot from 2023-10-18 15-19-33

and the one of iCubGazeboV2_5 without hands I am currently using:

Screenshot from 2023-10-18 15-19-33

and they are the same.

Moreover, in order to see which is the reference frame for MoveIt, we logged the pose reference frame:

[robot_moveit-5] [INFO] [1697633637.634155934] [move_group_demo]: Pose reference frame: root_link
[robot_moveit-5] [INFO] [1697633637.634157643] [move_group_demo]: End effector link: r_hand_dh_frame

so MoveIt is aligned with our pipeline.

cc @pattacini @traversaro

martinaxgloria commented 10 months ago

Today I did some more tests to understand whether the conversion between the axis-angle representation retrieved by yarp and the quaternion used inside the ros2 environment was correct (I gave MoveIt an RPY rotation and transformed it into a quaternion):

| RPY (x, y, z) | Quaternion (x, y, z, w) | Visual | Result |
| --- | --- | --- | --- |
| (0, 0, 0) | (0, 0, 0, 1) | Screenshot from 2023-10-19 14-51-12 | final hand pose (not reached) = root_link pose |
| (0, -π, 0) | (0, -1, 0, 0) | Screenshot from 2023-10-19 10-59-19 | hand in downward pose |
| (π/2, -π, 0) | (0, -0.707107, 0.707107, 0) | Screenshot from 2023-10-19 10-58-37 | hand in side grasping pose |
| (π, -π, 0) | (0, -0, 1, 0) | Screenshot from 2023-10-19 10-57-29 | hand in upward pose |
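For reference, the rows above can be reproduced with tf2's RPY conversion, e.g.:

```cpp
#include <cmath>
#include <cstdio>
#include <tf2/LinearMath/Quaternion.h>

// Sketch reproducing the table above: convert RPY (radians) into a quaternion with tf2.
int main()
{
  const double rpy[][3] = { {0, 0, 0}, {0, -M_PI, 0}, {M_PI / 2, -M_PI, 0}, {M_PI, -M_PI, 0} };
  for (const auto& a : rpy)
  {
    tf2::Quaternion q;
    q.setRPY(a[0], a[1], a[2]);
    std::printf("RPY(%5.2f, %5.2f, %5.2f) -> (%.6f, %.6f, %.6f, %.6f)\n",
                a[0], a[1], a[2], q.x(), q.y(), q.z(), q.w());
  }
  return 0;
}
```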

It all seems coherent with the RPY rotations I imposed. For this reason, I launched the icub grasping sandbox simulation again and logged the poses for the pre-grasp and grasp phases in a side-grasping condition (like the one I used from the beginning). I gave those poses to MoveIt and launched the simulation first, and then the demo on the real robot:

Screencast from 10-19-2023 01:56:41 PM.webm

https://github.com/icub-tech-iit/study-moveit/assets/114698424/b0ac19f2-376d-47d1-a532-c8f42d3715c5

Now the poses are consistent with the ideal movement! I probably should have logged those values more than once (like with the icub-grasp.sh test) to be sure.

cc @Nicogene @pattacini

pattacini commented 10 months ago

It's not clear to me what made everything work now, but cool that you managed to sort it out 🚀 We may try to sample a reaching trajectory and give it back to MoveIt2, if you like.

martinaxgloria commented 10 months ago

The poses I gave this time were not the same as before; I don't know why.

> We may try to sample a reaching trajectory and give it back to MoveIt2, if you like.

Did you mean by sampling the poses during the entire movement?

pattacini commented 10 months ago

> Did you mean by sampling the poses during the entire movement?

Yep 👍🏻

To present this, we can show a video of the sandbox doing the movement (using the Cartesian Control) and then we can run it on the robot using MoveIt2.

The robot will be doing slightly different movements (because of different IK's and problem formulations). Also, the robot won't gaze. This is ok, but perhaps we could close the hand in open-loop to signal that we reached the endpoint.

martinaxgloria commented 10 months ago

Hi @pattacini,

I implemented the closing of the hand during the grasping-like task, and this is the entire demo with the hand-down, the circle trajectory, and the reaching (with only 3 poses given to the solver):

https://github.com/icub-tech-iit/study-moveit/assets/114698424/3d3e756d-cbe2-4d28-b9ee-bbaa8153282a

As you can see, the torso is heavily involved in the movement, but we knew about this "problem". What do you think about the tasks now? cc @Nicogene
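For completeness, the open-loop hand closing is done outside MoveIt; a rough sketch of the kind of YARP call involved (port names, joint indices, and target angles are assumptions, not the actual demo values):

```cpp
#include <yarp/os/Network.h>
#include <yarp/os/Property.h>
#include <yarp/dev/PolyDriver.h>
#include <yarp/dev/IPositionControl.h>

// Sketch: close the fingers in open loop through a remote_controlboard.
// Port names, joint indices, and angles are placeholders.
int main()
{
  yarp::os::Network yarp;
  yarp::os::Property opts;
  opts.put("device", "remote_controlboard");
  opts.put("remote", "/icub/right_arm");
  opts.put("local", "/moveit_demo/right_arm");

  yarp::dev::PolyDriver driver(opts);
  yarp::dev::IPositionControl* ipos = nullptr;
  if (!driver.isValid() || !driver.view(ipos))
    return 1;

  for (int j = 7; j < 16; ++j)   // finger joints of the arm board (placeholder indices)
  {
    ipos->setRefSpeed(j, 30.0);  // deg/s
    ipos->positionMove(j, 40.0); // deg; illustrative closing amount
  }
  return 0;
}
```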

pattacini commented 10 months ago

Superb! 🚀

Nicogene commented 10 months ago

Awesome work indeed! 🎖️