PKU-EPIC / DexGraspNet


validate_grasps #22

jianguo-cmd opened this issue 1 month ago

jianguo-cmd commented 1 month ago

[screenshots of visualized grasp poses]

I ran validate_grasps.py to validate grasp results. Some grasps are reported as successful, but the visualized grasping poses do not actually grasp the object. Do I need to adjust the grasp success threshold? Looking forward to your reply. Thank you very much! @mzhmxzh @wrc042

wrc042 commented 1 month ago

That seems strange. I have fine-tuned the default parameters, so you should not need to adjust them. As for the false-positive cases, my guess is that they are corner cases in which the target pose drives the hand to penetrate the object.

jianguo-cmd commented 1 month ago

> That seems strange. I have fine-tuned the default parameters, so you should not need to adjust them. As for the false-positive cases, my guess is that they are corner cases in which the target pose drives the hand to penetrate the object.

Thank you very much for your response! I've also noticed something strange. When validating graspdata results with validate_grasps.py, it reports "estimated: 279/500, simulated: 71/500, valid: 48/500". However, when I visualize with "--index 0" in IsaacGym, the results differ from those produced by visualize_result.py, and the IsaacGym visualizations look incorrect. This strange phenomenon doesn't occur with the Shadow hand, only with my custom hand, and I'm unable to identify the reason. Do you have any insights into what might be causing this? Thank you very much!

[screenshots of the two visualizations]
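
For readers puzzling over those three counters: a minimal sketch of how such a summary could be produced, assuming the script keeps one boolean mask from the analytic force-closure estimate and one from the IsaacGym simulation (the variable names here are illustrative, not necessarily the repo's):

```python
import numpy as np

n = 500  # grasp candidates per object

# Hypothetical masks: `estimated` from the analytic force-closure check,
# `simulated` from replaying each grasp in IsaacGym.
estimated = np.zeros(n, dtype=bool)
simulated = np.zeros(n, dtype=bool)
# ... both masks get filled in during validation ...

valid = estimated & simulated  # a grasp must pass both checks
print(f"estimated: {estimated.sum()}/{n}, "
      f"simulated: {simulated.sum()}/{n}, "
      f"valid: {valid.sum()}/{n}")
```

Under this reading, "valid" is the intersection of the two checks, which is consistent with the numbers above (48 ≤ min(279, 71)).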

jianguo-cmd commented 1 month ago

> That seems strange. I have fine-tuned the default parameters, so you should not need to adjust them. As for the false-positive cases, my guess is that they are corner cases in which the target pose drives the hand to penetrate the object.

I'm also encountering issues where my generated grasp poses are sometimes incorrect and penetrate the object. I have a few questions:

1. Why do the generated grasp poses sometimes fail to effectively grasp the object?
2. The hand should face the object, but different target objects may require different initial hand positions and orientations. How should I set the initial position and orientation of the hand?
3. Is it necessary to set contact points and penetration points for the palm of the hand?
4. To generate different grasp poses, should I modify the 'seed' value, so that each seed corresponds to a different grasp pose?

I look forward to your response. Thank you very much!

wrc042 commented 1 month ago

You mentioned that "this strange phenomenon doesn't occur with the Shadow hand, only with my custom hand." What modifications did you make to your custom hand? I suspect there might be some issues there. My initial guess is that there are differences in the hand model between your visualization and IsaacGym; at the least, they appear to have different sizes.
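
One quick way to test that suspicion is to load the hand meshes the two pipelines use and compare their bounding boxes. This is only a hedged sketch; the paths are placeholders for wherever your visual and collision meshes actually live:

```python
import trimesh

# Placeholder paths: the mesh your visualizer renders vs. the collision
# mesh referenced by the URDF that IsaacGym loads.
vis_mesh = trimesh.load("my_hand/meshes/visual/palm.obj")
sim_mesh = trimesh.load("my_hand/meshes/collision/palm.stl")

print("visual extents:   ", vis_mesh.extents)   # per-axis bounding-box size
print("collision extents:", sim_mesh.extents)

# A constant ratio (e.g. ~1000x) usually means a meters-vs-millimeters unit
# mismatch; URDF <mesh scale="..."> attributes are a common culprit.
```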

wrc042 commented 1 month ago

I'll try to help with your issues:

  1. This question seems too general and hard to answer. What do you mean by "not effectively"?
  2. We proposed a general initialization algorithm that makes the hand palm always point roughly toward the center of the object. There is a description in the article (Fig. 2(c)); maybe you can check it.
  3. I think the points on the palm are necessary.
  4. To generate different grasp poses, we apply different initial poses for the optimization. The random seed influences the initial poses, so changing the seed produces different grasp poses after optimization. You are right; see the sketch below.
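
A minimal sketch of that initialization idea, assuming only what Fig. 2(c) describes: sample a palm position on the object's inflated convex hull, point the palm toward the object's center, and let the seed drive the randomness. The function name, the `inflate` parameter, and the frame construction are illustrative assumptions, not the repo's actual code:

```python
import numpy as np
import trimesh

def sample_initial_palm_pose(mesh: trimesh.Trimesh, inflate: float = 0.2, seed: int = 0):
    rng = np.random.default_rng(seed)  # a different seed -> a different initial pose

    # "Inflate" the convex hull by pushing each vertex outward along its normal.
    hull = mesh.convex_hull
    inflated = trimesh.Trimesh(
        vertices=hull.vertices + inflate * hull.vertex_normals,
        faces=hull.faces,
    )

    # Pick a random point on the inflated hull as the palm position.
    position = inflated.vertices[rng.integers(len(inflated.vertices))]

    # The palm's forward axis points from the palm toward the object's center.
    z = mesh.centroid - position
    z /= np.linalg.norm(z)

    # Complete an orthonormal frame; the roll around z is left as a free DoF
    # for the optimizer (or more seeded noise) to explore.
    x = np.cross(z, [0.0, 0.0, 1.0])
    if np.linalg.norm(x) < 1e-6:          # z was (anti)parallel to the world z-axis
        x = np.cross(z, [0.0, 1.0, 0.0])
    x /= np.linalg.norm(x)
    y = np.cross(z, x)
    return position, np.stack([x, y, z], axis=1)  # 3x3 rotation, columns = axes
```

Calling this twice with different seeds gives two different palm placements on the hull, which is why varying the seed yields different grasps after optimization.
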
jianguo-cmd commented 1 month ago

> You mentioned that "this strange phenomenon doesn't occur with the Shadow hand, only with my custom hand." What modifications did you make to your custom hand? I suspect there might be some issues there. My initial guess is that there are differences in the hand model between your visualization and IsaacGym; at the least, they appear to have different sizes.

Thank you, this issue may now be resolved; it seems I had misunderstood earlier. IsaacGym validates all 500 grasp poses and determines which ones are successful, so the poses visualized through IsaacGym may include failures. visualize_result.py, on the other hand, shows results from the dataset that have already been validated as successful, which is why its visualizations look reasonable.
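
To make that distinction concrete, here is a hedged sketch of the filtering step it implies; the file layout and the source of the success mask are assumptions for illustration, not the repo's actual I/O:

```python
import numpy as np

# All 500 candidate grasps from the energy optimization
# (hypothetical path; each entry holds one hand pose).
candidates = np.load("graspdata/bottle.npy", allow_pickle=True)

# Boolean success mask produced by the IsaacGym validation pass.
valid = np.load("graspdata/bottle_valid.npy")

# Only the survivors enter the final dataset, so visualize_result.py can
# never show a failed grasp, while browsing raw candidates by --index can.
dataset = [g for g, ok in zip(candidates, valid) if ok]
np.save("dataset/bottle.npy", np.array(dataset, dtype=object))
```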

jianguo-cmd commented 1 month ago

> That seems strange. I have fine-tuned the default parameters, so you should not need to adjust them. As for the false-positive cases, my guess is that they are corner cases in which the target pose drives the hand to penetrate the object.

Thank you very much for your response. The false positives were due to errors in my hand model, which are now corrected.

jianguo-cmd commented 1 month ago

> You mentioned that "this strange phenomenon doesn't occur with the Shadow hand, only with my custom hand." What modifications did you make to your custom hand? I suspect there might be some issues there. My initial guess is that there are differences in the hand model between your visualization and IsaacGym; at the least, they appear to have different sizes.

In the grasp poses actually generated, it is difficult to obtain grasps of cylindrical objects aligned with their axis (for example, water bottles: the generated poses rarely grasp the body of the bottle). Is this because, as shown in Fig. 2(c), initialization samples points on the object's inflated convex hull?

[screenshot of the generated grasp poses]

wrc042 commented 1 month ago

Yes, I think this is one of the reasons. We also noticed this situation. My guess is that grasps of bottle bodies are more difficult for our algorithm to generate and tend to be rejected as unstable or penetrating poses.