graspnet / anygrasp_sdk


translation data seems to be wrong #32

Closed: LunaceC closed this issue 7 months ago

LunaceC commented 7 months ago

Hi! While the model generates satisfying grasps in the Open3D visualization, the translation term in the grasp data I get always seems to be wrong.

For instance, the model just generated [0.03345555 0.16258606 0.88210291] as the translation of the highest-score grasp. However, as I measured (with a ruler), the grasp point should actually be at about [-0.255, -0.074, 0.685], which is definitely not caused by measurement error lol

However, the rotation matrix it generates is actually quite accurate, which somehow suggests that my image input works.

[Screenshots: 2023-11-23 21-03-17 and 2023-11-23 21-02-55]

Meanwhile, in the Open3D view the grasp seems adequately located, which really confuses me.

Do you have any idea about this? Any help would be appreciated!

LunaceC commented 7 months ago

Okay, here's more weird stuff after further tests:

After switching between different test cases, I found that the grasp data generated for 640×480 images was fully correct, while all 1280×720 images failed on the translation part. The rotation matrices are good in all test cases.

This only added to my confusion, but I hope it helps with locating the problem XD

chenxi-wang commented 7 months ago

Did you change the intrinsic matrix accordingly when you switched the image size?
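
For context, in a pinhole camera model the translation is recovered by back-projecting each depth pixel through the intrinsics (x = (u - cx) * z / fx, y = (v - cy) * z / fy, with z the measured depth), so running 1280×720 pixels through a matrix calibrated for 640×480 shifts and scales every position, while local surface shape, and hence the rotation estimate, is left largely intact; that may be why the rotations still looked plausible. A minimal sketch of the kind of rescaling meant here, with illustrative names rather than the SDK's actual API:

```python
import numpy as np

def scale_intrinsics(K, old_size, new_size):
    """Rescale a 3x3 pinhole intrinsic matrix for a new image resolution.

    K        : 3x3 intrinsic matrix calibrated at old_size
    old_size : (width, height) used for calibration, e.g. (640, 480)
    new_size : (width, height) of the frames actually captured, e.g. (1280, 720)
    """
    sx = new_size[0] / old_size[0]   # horizontal scale factor
    sy = new_size[1] / old_size[1]   # vertical scale factor
    K_new = K.copy()
    K_new[0, 0] *= sx   # fx
    K_new[0, 2] *= sx   # cx
    K_new[1, 1] *= sy   # fy
    K_new[1, 2] *= sy   # cy
    return K_new
```

Note that 640×480 (4:3) and 1280×720 (16:9) have different aspect ratios, so if the camera crops rather than stretches between modes, a pure rescale is only an approximation; most RGB-D drivers report per-resolution intrinsics, which are the safer source.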

LunaceC commented 7 months ago

You're right!!! I had put the intrinsic adjustment in the hand-eye calibration step, so it wasn't applied in my main loop when I changed the image size. Now the project works perfectly.
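
For anyone who lands here with the same symptom, the fix amounts to recomputing the intrinsics every frame in the main loop instead of once during calibration. A hypothetical sketch reusing the helper above (color_image and K_calib are illustrative names from my setup):

```python
# inside the main loop: match the intrinsics to the frame actually grabbed
h, w = color_image.shape[:2]                        # e.g. 720, 1280
K = scale_intrinsics(K_calib, (640, 480), (w, h))   # K_calib from hand-eye calibration
```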

Thanks a ton for the fast response!!