lardemua / atom

Calibration tools for multi-sensor, multi-modal robotic systems
GNU General Public License v3.0
248 stars 27 forks

Implement Ali's RWHE calibration method #939

Open Kazadhum opened 4 months ago

Kazadhum commented 4 months ago

The idea is to implement the RWHE-Calib method for Hand-Eye calibration in ATOM.

Similarly to OpenCV, they have two different methods: Hand-Eye (AX = XB) and Robot-World/Hand-Eye (AX = ZB).

Am I correct in assuming we only want the second, @miguelriemoliveira? I remember we spoke of this on Monday because of the OpenCV calibration, and if I recall correctly we landed on just using RWHE.

By the way, ATOM has a script called "convert_from_rwhe_dataset.py". Are these scripts referring to this method?

miguelriemoliveira commented 4 months ago

> Am I correct in assuming we only want the second, @miguelriemoliveira? I remember we spoke of this on Monday because of the OpenCV calibration, and if I recall correctly we landed on just using RWHE.

I think the hand-eye method should solve both variants, eye-in-hand and eye-to-world. In any case, you can use Robot-World/Hand-Eye; I think it's better.

I have been fighting with OpenCV's method and still could not make it work.

> By the way, ATOM has a script called "convert_from_rwhe_dataset.py". Are these scripts referring to this method?

Yes, we used Ali's method, but via the MATLAB script.

Kazadhum commented 4 months ago

Hi @miguelriemoliveira!

> I think the hand-eye method should solve both variants, eye-in-hand and eye-to-world. In any case, you can use Robot-World/Hand-Eye; I think it's better.

Ok, that sounds good.

> I have been fighting with OpenCV's method and still could not make it work.

If you want, I could join you on Zoom after lunch to try to help. Have you made any progress, or is it still how we left it?

> Yes, we used Ali's method, but via the MATLAB script.

Got it! These scripts will help a lot, I think.

miguelriemoliveira commented 4 months ago

> If you want, I could join you on Zoom after lunch to try to help. Have you made any progress, or is it still how we left it?

I changed a lot while trying to clean up the code. It did not help.

I will call if I have some time this afternoon.

Kazadhum commented 4 months ago

Ok, sounds good!

Kazadhum commented 4 months ago

Picking this back up...

Kazadhum commented 4 months ago

Possible progress! Li's calibration method from that code base is now translated to Python (hopefully correctly!). Next I need to get the comparison working so I can test whether it behaves properly. I can't say I understand the code exactly, but I think the translation to Python is correct.
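For context (the ported ATOM script itself isn't shown in this thread), Li's method solves A_i X = Z B_i linearly: the rotation constraint R_Ai R_X = R_Z R_Bi becomes a homogeneous system via the identity vec(PQR) = (Rᵀ ⊗ P) vec(Q), solved with an SVD, and the translations then follow from ordinary least squares. A hedged numpy sketch of that idea (function names and structure are mine, not the script's):

```python
# Sketch of the Kronecker-product linear solve behind Li's RWHE method
# (simultaneous A X = Z B). Illustrative reimplementation, not ATOM's
# ported script.
import numpy as np

def li_solve(As, Bs):
    """Solve A_i X = Z B_i for 4x4 transforms X, Z from pose pairs (A_i, B_i)."""
    n = len(As)
    # Rotation: (I ⊗ R_Ai) vec(R_X) - (R_Bi^T ⊗ I) vec(R_Z) = 0  (column-major vec)
    M = np.zeros((9 * n, 18))
    for i, (A, B) in enumerate(zip(As, Bs)):
        M[9 * i:9 * i + 9, :9] = np.kron(np.eye(3), A[:3, :3])
        M[9 * i:9 * i + 9, 9:] = -np.kron(B[:3, :3].T, np.eye(3))
    v = np.linalg.svd(M)[2][-1]  # null-space vector (smallest singular value)
    v *= np.sign(np.linalg.det(v[:9].reshape(3, 3, order="F")))  # fix global sign

    def to_so3(R):  # project a scaled matrix onto SO(3)
        U, _, Vt = np.linalg.svd(R)
        return U @ np.diag([1.0, 1.0, np.linalg.det(U @ Vt)]) @ Vt

    RX = to_so3(v[:9].reshape(3, 3, order="F"))
    RZ = to_so3(v[9:].reshape(3, 3, order="F"))
    # Translation: R_Ai t_X - t_Z = R_Z t_Bi - t_Ai  (linear least squares)
    C, d = np.zeros((3 * n, 6)), np.zeros(3 * n)
    for i, (A, B) in enumerate(zip(As, Bs)):
        C[3 * i:3 * i + 3, :3] = A[:3, :3]
        C[3 * i:3 * i + 3, 3:] = -np.eye(3)
        d[3 * i:3 * i + 3] = RZ @ B[:3, 3] - A[:3, 3]
    t = np.linalg.lstsq(C, d, rcond=None)[0]
    X, Z = np.eye(4), np.eye(4)
    X[:3, :3], X[:3, 3] = RX, t[:3]
    Z[:3, :3], Z[:3, 3] = RZ, t[3:]
    return X, Z

# Synthetic sanity check against a known ground truth
rng = np.random.default_rng(0)
def rand_T(rng):
    Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
    if np.linalg.det(Q) < 0:
        Q[:, 0] *= -1
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = Q, rng.normal(size=3)
    return T

X_gt, Z_gt = rand_T(rng), rand_T(rng)
As = [rand_T(rng) for _ in range(5)]
Bs = [np.linalg.inv(Z_gt) @ A @ X_gt for A in As]  # exact A X = Z B data
X_est, Z_est = li_solve(As, Bs)
```

A noise-free round trip like the one at the bottom is a useful unit test for the port: if the translated code cannot recover a known X and Z from exact data, the bug is in the translation, not the dataset.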

miguelriemoliveira commented 4 months ago

That's good, but running it is the real test.

Looking forward to seeing if it runs correctly.

Kazadhum commented 4 months ago

Hello all! I think I closed this by accident this morning...

Some notes:

For the eye-in-hand case, running:

rosrun atom_evaluation li_eye_in_hand.py -c rgb_hand -p pattern_1 -bl base_link -hl flange -json $ATOM_DATASETS/rihbot/train_test_opencv/dataset.json -ctgt

we get:

After filtering, will use 5 collections: ['000', '001', '002', '003', '004']
Selected collection key is 000
Calculating A and B matrices for collection 000...
Calculating A and B matrices for collection 001...
Calculating A and B matrices for collection 002...
Calculating A and B matrices for collection 003...
Calculating A and B matrices for collection 004...
Ground Truth h_T_c=
[[ 0.00000000e+00  1.11022302e-16  1.00000000e+00 -2.00000000e-02]
 [-1.00000000e+00 -2.22044605e-16  1.11022302e-16  0.00000000e+00]
 [ 0.00000000e+00 -1.00000000e+00  0.00000000e+00  6.50000000e-02]
 [ 0.00000000e+00  0.00000000e+00  0.00000000e+00  1.00000000e+00]]
estimated h_T_c=
[[ 1.49042521e-04 -3.88510070e-04  9.99999913e-01 -1.96416458e-02]
 [-9.99999898e-01 -4.25657727e-04  1.48877146e-04 -9.81041564e-04]
 [ 4.25599850e-04 -9.99999834e-01 -3.88573471e-04  6.44773915e-02]
 [ 0.00000000e+00  0.00000000e+00  0.00000000e+00  1.00000000e+00]]
Etrans = 0.621 (mm)
Erot = 0.018 (deg)
+----------------------+-------------+---------+----------+-------------+------------+
|      Transform       | Description | Et0 [m] |  Et [m]  | Rrot0 [rad] | Erot [rad] |
+----------------------+-------------+---------+----------+-------------+------------+
| flange-rgb_hand_link |   rgb_hand  |   0.0   | 0.000382 |     0.0     |  0.000321  |
+----------------------+-------------+---------+----------+-------------+------------+
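For completeness, one common way to compute translation/rotation errors like the Etrans/Erot figures above is the Euclidean distance between translations plus the angle of the relative rotation; the exact formula in ATOM's evaluation script may differ, so this is only a sketch:

```python
# Sketch: translation / rotation error between a ground-truth and an
# estimated homogeneous transform. Mirrors the kind of Etrans/Erot metric
# printed above; ATOM's actual formula may differ.
import numpy as np

def transform_errors(T_gt, T_est):
    """Return (translation error [mm], rotation error [deg])."""
    etrans = np.linalg.norm(T_gt[:3, 3] - T_est[:3, 3]) * 1000.0  # m -> mm
    R_delta = T_gt[:3, :3].T @ T_est[:3, :3]  # relative rotation
    # Angle of the relative rotation from its trace, clipped for safety
    cos_angle = np.clip((np.trace(R_delta) - 1.0) / 2.0, -1.0, 1.0)
    erot = np.degrees(np.arccos(cos_angle))
    return etrans, erot
```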

Now creating a copy of this script for the eye-to-hand case and adapting it accordingly...

Kazadhum commented 4 months ago

Working on implementing Shah's method, also from that repo. Since it is similar to Li's, the implementation should be relatively simple.
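For reference, the main difference from Li's method is in the rotation step: under my reading of Shah's formulation, stacking T = Σᵢ (R_Bi ⊗ R_Ai) and taking its leading singular vectors recovers vec(R_X) (right vector) and vec(R_Z) (left vector), since R_Z = R_Ai R_X R_Biᵀ implies (R_Bi ⊗ R_Ai) vec(R_X) = vec(R_Z); the translation step is the same linear least squares. A hedged sketch of just the rotation step (illustrative code, not the repo's script):

```python
# Sketch of the rotation step in Shah's RWHE variant: the leading singular
# pair of T = sum_i kron(R_Bi, R_Ai) gives vec(R_X) and vec(R_Z).
# Illustrative code under my reading of the method, not the repo's script.
import numpy as np

def shah_rotations(RAs, RBs):
    """Recover (R_X, R_Z) from lists of 3x3 rotations R_Ai, R_Bi."""
    T = sum(np.kron(RB, RA) for RA, RB in zip(RAs, RBs))
    U, _, Vt = np.linalg.svd(T)
    vx, vz = Vt[0], U[:, 0]  # leading right/left singular vectors
    if np.linalg.det(vx.reshape(3, 3, order="F")) < 0:
        vx, vz = -vx, -vz    # fix the joint sign ambiguity

    def to_so3(R):           # project a scaled matrix onto SO(3)
        U2, _, Vt2 = np.linalg.svd(R)
        return U2 @ np.diag([1.0, 1.0, np.linalg.det(U2 @ Vt2)]) @ Vt2

    return (to_so3(vx.reshape(3, 3, order="F")),
            to_so3(vz.reshape(3, 3, order="F")))

# Synthetic sanity check with known ground-truth rotations
rng = np.random.default_rng(1)
def rand_R(rng):
    Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
    return Q if np.linalg.det(Q) > 0 else -Q

RX_gt, RZ_gt = rand_R(rng), rand_R(rng)
RAs = [rand_R(rng) for _ in range(4)]
RBs = [RZ_gt.T @ RA @ RX_gt for RA in RAs]  # exact R_A R_X = R_Z R_B data
RX_est, RZ_est = shah_rotations(RAs, RBs)
```

Since only the SVD step changes, most of a working Li implementation (the A/B matrix construction and the translation solve) should carry over unchanged.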

Kazadhum commented 3 months ago

Bug-fixing Li's method: testing on the (simulated) riwmpbot revealed issues.