icub-tech-iit / xcub-moveit2

Collect the outcomes of our study on the use of MoveIT with our robots
BSD 3-Clause "New" or "Revised" License

Make tests on the performance of the controller #18

Closed by Nicogene 9 months ago

Nicogene commented 10 months ago

Once done, we could write a report to be added to the repo.

pattacini commented 9 months ago

It would be nice to come up with a sort of workspace estimation, maybe just for a limited portion of the reaching space of the robot.

In the far past, we did something similar for the iCub: https://github.com/robotology/icub-workspace-estimation.

I'm not saying that we should do the same thing, but we could visualize a similar narrower estimation.

https://github.com/icub-tech-iit/study-moveit/assets/3738070/f87fe85c-fd8f-4931-b27d-d3756ca59255

See https://github.com/robotology/icub-workspace-estimation/tree/master/examples.

martinaxgloria commented 9 months ago

I started working on this activity. This is a WIP.

I started developing a "test" to understand which poses are reachable with the planning group torso + right_arm. I used the getRandomPose method provided by the moveit::planning_interface::MoveGroup class to generate valid random poses (position + orientation) for the end effector (i.e. the right hand). So far, I tried with 10 poses and this is what I obtained:

Screencast from 11-08-2023 03:39:22 PM.webm

In this video, 100% of the trajectory is reached 3 times out of 10; the other poses are not completely reachable because of detected collisions (collision avoidance between links is enabled when computing the Cartesian trajectory). I also printed out the desired pose and the current pose, so it's possible to compute the error between them. I'll keep working on it.
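Roughly, the sampling loop looks like the sketch below (a minimal sketch, not the exact code in the package; the node name, the planning-group name "torso_right_arm" and the exact MoveGroupInterface signatures are assumptions and may differ across MoveIt 2 versions):

```cpp
// Sketch: sample N random end-effector poses with getRandomPose() and try to
// reach each one with a Cartesian path, counting how often 100% is achieved.
#include <rclcpp/rclcpp.hpp>
#include <moveit/move_group_interface/move_group_interface.h>
#include <geometry_msgs/msg/pose_stamped.hpp>
#include <moveit_msgs/msg/robot_trajectory.hpp>

int main(int argc, char** argv)
{
  rclcpp::init(argc, argv);
  auto node = rclcpp::Node::make_shared("workspace_sampler");

  // "torso_right_arm" is a placeholder for the torso + right_arm planning group
  moveit::planning_interface::MoveGroupInterface group(node, "torso_right_arm");

  const int n_samples = 10;
  int fully_reached = 0;
  for (int i = 0; i < n_samples; ++i)
  {
    // Random but kinematically valid pose for the end effector (right hand)
    geometry_msgs::msg::PoseStamped target = group.getRandomPose();

    std::vector<geometry_msgs::msg::Pose> waypoints{ target.pose };
    moveit_msgs::msg::RobotTrajectory trajectory;

    // eef_step = 1 cm, no jump threshold, collision checking enabled
    double fraction = group.computeCartesianPath(waypoints, 0.01, 0.0, trajectory, true);

    RCLCPP_INFO(node->get_logger(), "pose %d: %.0f%% of the path achieved", i, fraction * 100.0);
    if (fraction >= 1.0)
    {
      ++fully_reached;
      group.execute(trajectory);
    }
  }
  RCLCPP_INFO(node->get_logger(), "%d/%d poses fully reached", fully_reached, n_samples);

  rclcpp::shutdown();
  return 0;
}
```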

cc @Nicogene @pattacini

martinaxgloria commented 9 months ago

Today I repeated the test with 100 poses instead of 10, with ~23/100 poses reached. I saved the results and tried to plot them with scatter3 in MATLAB, obtaining:

scatter

Of course, these are not enough yet; it was just a first attempt to look at the result. I'm planning to collect something like 1000 poses.

I noticed that when, from a non-reached pose, I tried to go back to the home position with a Cartesian trajectory, sometimes the controller was not able to compute the inverse movement and got stuck. From that point on, the trajectory generation was affected and returned a lot of failures. To prevent this situation, after the random valid pose for the end effector is reached, the return movement is now commanded in joint space rather than in Cartesian space, so that each trajectory generation starts from the same configuration of the kinematic chain.
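The joint-space return looks roughly like this (a hedged sketch; the helper name and the idea of caching the home configuration at startup are assumptions, not the actual code in the package):

```cpp
// Go back home in joint space between samples, so that every Cartesian attempt
// starts from the same chain configuration, regardless of how the previous one ended.
#include <vector>
#include <moveit/move_group_interface/move_group_interface.h>

void returnHomeInJointSpace(moveit::planning_interface::MoveGroupInterface& group,
                            const std::vector<double>& home_joint_values)
{
  // Joint-space target: no Cartesian IK involved, so it cannot get stuck
  // even if the previous Cartesian path failed half-way.
  group.setJointValueTarget(home_joint_values);
  group.move();
}

// Usage: cache the home configuration once at startup ...
//   std::vector<double> home = group.getCurrentJointValues();
// ... and call returnHomeInJointSpace(group, home) after every sampled pose.
```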

pattacini commented 9 months ago

How are these random poses generated? 23/100 seems quite a low success rate.

I noticed that when, from a non-reached pose, I tried to go back to the home position with a Cartesian trajectory, sometimes the controller was not able to compute the inverse movement and got stuck. From that point on, the trajectory generation was affected and returned a lot of failures. To prevent this situation, after the random valid pose for the end effector is reached, the return movement is now commanded in joint space rather than in Cartesian space, so that each trajectory generation starts from the same configuration of the kinematic chain.

This is a potential problem that we should solve on our end. We ought to be able to reinstate Cartesian control irrespective of the previous result. Maybe an initialization problem of the solver?

martinaxgloria commented 9 months ago

How are these random poses generated? 23/100 seems quite a low success rate.

As said in the previous comment, I used the getRandomPose method provided by MoveIt. It generates random but valid poses, so they should be solvable without any problem. Maybe they are not solved because of collisions detected between adjacent links during execution (so far, collision checking is enabled, so the IK resolution fails when a collision is detected).

Maybe an initialization problem of the solver?

I'll investigate. I'm going to read some documentation and the paper about the TRAC-IK implementation.

martinaxgloria commented 9 months ago

Some updates on this activity:

cc @Nicogene @pattacini

martinaxgloria commented 9 months ago

Running more tests on the collision issue, I noticed that with the iCub model the link pairs that go into collision most frequently are chest - root_link and r_forearm - r_hand.

Screenshot from 2023-11-13 16-16-51

I checked the self-collision matrix from the MoveIt Setup Assistant, and those pairs have collision checking enabled as expected, since they're not adjacent and can come into contact.

Screenshot from 2023-11-13 16-18-15

Screenshot from 2023-11-13 16-18-31

Those collisions are probably due to the covers, but they have to be taken into account when running the test on the real robot, so I think they should not be disabled. Moreover, I double-checked with the ergoCub model to understand whether these collisions are detected in that case too, and indeed they are:

Screenshot from 2023-11-13 17-12-45

In this case, the colliding links are r_hand_palm - r_wrist_1, which are not adjacent in the kinematic chain and have the self-collision check enabled.

Instead of using the getRandomPose method, I tried sampling the space as done in the icub-workspace-estimation repo, but even after carefully reading the code I wasn't able to understand how to sample the orientation in addition to the position.

cc @Nicogene @pattacini

pattacini commented 9 months ago

Nice investigation @martinaxgloria 👍🏻

At any rate, I think that we can make things simpler. Let's do the following:

martinaxgloria commented 9 months ago

Today, @Nicogene and I did some visual analysis of the collisions detected during the tests. In particular, starting from the observation that during workspace sampling a collision between r_hand and r_forearm is often detected, we moved r_wrist_pitch to its upper limit from the yarpmotorgui, using the iCubGazeboV2_5 (no hands) model, in order to reproduce the collision:

image

As you can see from the image above, the collision is due to contact between the forearm cover and the hand mesh: in this particular model, the hand shrinkwrap is quite different from the real one, since there are no fingers, and the cover is modeled with extra pieces that are not present on the real robot, like the one on the wrist highlighted in red in the figure, which comes into contact. For this reason, we decided that we can disable the self-collision check between the two links mentioned above. After doing that, we can sample the workspace again, this time with collision checking enabled during Cartesian trajectory generation, and see the results.

cc @pattacini

pattacini commented 9 months ago

Orientation error in axis-angle notation, from the Sciavicco-Siciliano:

image
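For reference, the axis-angle orientation error is usually written as in the block below (a transcription of the standard formula, to be checked against the image above and the book's notation):

```latex
% Orientation error in axis-angle form (Sciavicco-Siciliano style):
% the angle/axis pair (theta, r) is extracted from the relative rotation R_d R_e^T.
e_O = \mathbf{r}\,\sin\vartheta,
\qquad \text{with } \mathbf{R}(\vartheta,\mathbf{r}) = \mathbf{R}_d \mathbf{R}_e^{T},
\qquad
e_O = \tfrac{1}{2}\left(\mathbf{n}_e \times \mathbf{n}_d
    + \mathbf{s}_e \times \mathbf{s}_d
    + \mathbf{a}_e \times \mathbf{a}_d\right)
```

where R_e = [n_e s_e a_e] is the current rotation matrix and R_d = [n_d s_d a_d] the desired one.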

cc @martinaxgloria @Nicogene

martinaxgloria commented 9 months ago

After a f2f alignment with @pattacini and @Nicogene, we decided to go on without collision checking for the time being, to reduce the number of variables that could cause failures. Moreover, we decided to investigate whether there's a correspondence (or even a dependency) between the pose error (difference between the pose to be reached and the one actually reached) and the value returned by computeCartesianPath (as stated in the API documentation, this function "return[s] a value that is between 0.0 and 1.0 indicating the fraction of the path achieved as described by the waypoints. Return -1.0 in case of error.").
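The logging could be as simple as the sketch below (assumed, not the actual analysis code): for each target, record the computeCartesianPath fraction together with the final position error, so the two can be correlated offline (e.g. in MATLAB).

```cpp
// Log "fraction of path achieved" vs. final position error, one CSV row per target.
#include <cmath>
#include <fstream>
#include <moveit/move_group_interface/move_group_interface.h>
#include <geometry_msgs/msg/pose.hpp>

void logFractionVsError(std::ofstream& csv,
                        moveit::planning_interface::MoveGroupInterface& group,
                        const geometry_msgs::msg::Pose& desired, double fraction)
{
  // fraction in [0, 1]: portion of the waypoint path achieved; -1.0 signals an error.
  const geometry_msgs::msg::Pose current = group.getCurrentPose().pose;
  const double ex = desired.position.x - current.position.x;
  const double ey = desired.position.y - current.position.y;
  const double ez = desired.position.z - current.position.z;
  csv << fraction << "," << std::sqrt(ex * ex + ey * ey + ez * ez) << "\n";
}
```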

Today I'm going to collect this data and run the analysis.

cc @Nicogene @pattacini

martinaxgloria commented 9 months ago

From a first preliminary analysis (thanks @pattacini for the help), it turned out that the percentage indicating the fraction of the path achieved is somehow related to the position error between the ideal and the actual current pose.

MicrosoftTeams-image

However, we found that even for a quite high success percentage (~80%), the position error can be quite high as well (~4 cm). For this reason, in agreement with @Nicogene, we decided to give up on these fractions and rely only on the errors in both position and orientation.

martinaxgloria commented 9 months ago

The data analysis regarding only the position error resulted in something like this:

position_error

The two plots represent the workspace in front of iCub sampled with a spatial step of 5 cm in each direction, one with the end effector (i.e. the right hand) oriented downward, the other with it oriented leftward (a reaching-like orientation). The colormap indicates the error between the desired and the current positions.
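The grid sampling could look like the sketch below (the workspace bounds and the quaternion standing in for the "downward" orientation are placeholders, not the values actually used in the tests):

```cpp
// Sample a box in front of the robot with a 5 cm step and a fixed hand orientation.
#include <vector>
#include <geometry_msgs/msg/pose.hpp>

std::vector<geometry_msgs::msg::Pose> sampleGrid()
{
  geometry_msgs::msg::Quaternion downward;  // placeholder orientation (180 deg about y)
  downward.x = 0.0; downward.y = 1.0; downward.z = 0.0; downward.w = 0.0;

  std::vector<geometry_msgs::msg::Pose> targets;
  const double step = 0.05;  // 5 cm
  for (double x = 0.20; x <= 0.40; x += step)        // placeholder bounds
    for (double y = -0.30; y <= 0.10; y += step)
      for (double z = -0.10; z <= 0.30; z += step)
      {
        geometry_msgs::msg::Pose p;
        p.position.x = x;
        p.position.y = y;
        p.position.z = z;
        p.orientation = downward;
        targets.push_back(p);
      }
  return targets;
}
```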

Regarding the orientation, instead, this afternoon I read up on how to compute the error angle with quaternions. Tomorrow I'm going to add the other plots.
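A small sketch of that quaternion-based error angle, i.e. the angle of the relative rotation between the desired and current orientations (assuming Eigen; not necessarily the code that will end up in the package):

```cpp
// Error angle between two orientations: theta = 2 * acos(|<q_d, q_c>|).
#include <algorithm>
#include <cmath>
#include <Eigen/Geometry>

double orientationErrorAngle(const Eigen::Quaterniond& q_desired,
                             const Eigen::Quaterniond& q_current)
{
  // |dot| handles the double cover (q and -q represent the same rotation);
  // clamping guards against numerical noise slightly above 1.
  const double d = std::min(1.0, std::abs(q_desired.dot(q_current)));
  return 2.0 * std::acos(d);  // radians, in [0, pi]
}
```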

cc @Nicogene @pattacini

martinaxgloria commented 9 months ago

Some updates on this. During the last few days, I collected more data to get a more detailed analysis of the performance of this controller. In particular, starting from this configuration file with the TRAC-IK parameters, I sampled the workspace (different positions, but the orientation of the right hand always the same, i.e. leftward) changing the value of epsilon, whose default value is 1e-5. The resulting plots showed very little difference with this parameter set to 1e-3, 1e-4 or 1e-5, but with 1e-6 the errors in both position and orientation became very high.

In this sense, I decided to stick with epsilon = 1e-4, and this time I changed the parameter called solve_type, which can be set to Distance (by default), Speed, Manipulation1 or Manipulation2 (here the documentation about this). I obtained the worst results with Speed, while with Distance I often ran into:

[ERROR] |gazebo-yarp-plugins.plugins.GazeboYarpControlBoard| An hardware fault occurred on joint  0  torque too big! (  2047.52  )

For this reason, I increased the max_torques values to very high, unrealistic values (like 100000.0), and I'm now repeating the acquisition.

I'll add more updates after this test.

cc @Nicogene @pattacini

martinaxgloria commented 9 months ago

After the max_torques workaround, the best solve_type is Distance, both in terms of position and orientation. For this reason, I will take this configuration (solve_type = Distance and epsilon = 1e-4) as the best one, and I'll post here the final plots for both of the chosen right-hand orientations (leftward and downward, as per this comment).

cc @Nicogene @pattacini

martinaxgloria commented 9 months ago

Here are the final plots. In the end, I plotted the position and orientation estimation for both the downward and leftward orientations of the right hand, with the TRAC-IK parameters tuned as per the previous comment.

final_plot

Please let me know if they are OK, so that I can start writing the report about this activity to be included in this repo.

cc @Nicogene @pattacini

pattacini commented 9 months ago

Fine with me! Well done 🚀

martinaxgloria commented 9 months ago

Thanks @pattacini! If you want, you can find a first attempt at the report here. I created a new ros2 package inside the feat/test_controller branch with the code and the report. I'm also going to add the MATLAB snippets I used to generate the plots. Any comment/suggestion is welcome.

cc @Nicogene

Nicogene commented 9 months ago

Fine also for me! It would be super to have the report ready for tomorrow's review 💪🏻

martinaxgloria commented 9 months ago

I think most of it is done; maybe something should still be adjusted or added. I'll do my best.

martinaxgloria commented 9 months ago

In the meantime, while reading the report again to check whether anything is missing, I opened a PR with everything about these tests, including the report:

cc @Nicogene @pattacini

pattacini commented 9 months ago

Thanks @pattacini! If you want, you can find a first attempt at the report here. I created a new ros2 package inside the feat/test_controller branch with the code and the report. I'm also going to add the MATLAB snippets I used to generate the plots. Any comment/suggestion is welcome.

cc @Nicogene

Superb!