robertokcanale opened 2 years ago
Just an example of a bad grasp. Here I include an STL file as the URDF visual and collision geometry, but I get nothing from it.
Would you give it a try in https://github.com/facebookresearch/tacto/blob/main/examples/demo_pybullet_digit.py? It might be easier to debug by dragging the object around.
It might be that the collision detection in PyBullet differs slightly from the one in PyRender. For debugging purposes:
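One way to check the PyBullet side directly is to print what `p.getContactPoints` reports while dragging the object: if it returns nothing even when the meshes visibly overlap, PyBullet itself never registers the collision, independently of the renderer. A minimal sketch (the helper below is my own, not part of TACTO; `robot_id`/`object_id` are assumed to come from your setup code):

```python
def summarize_contacts(contacts):
    """Condense the tuples returned by p.getContactPoints into
    (count, min_distance, max_normal_force) for quick printing.

    Indices 8 and 9 are contactDistance and normalForce in the
    contact-point tuples described in the PyBullet quickstart guide.
    """
    if not contacts:
        return (0, None, None)
    distances = [c[8] for c in contacts]
    forces = [c[9] for c in contacts]
    return (len(contacts), min(distances), max(forces))

# Hedged usage inside the simulation loop:
#   contacts = p.getContactPoints(bodyA=robot_id, bodyB=object_id)
#   print(summarize_contacts(contacts))
# If this keeps printing (0, None, None) while the gripper visibly
# touches the object, the problem is in PyBullet's collision setup,
# not in the tactile rendering.
```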
Thank you for your reply. I am trying to work with PyBullet as suggested.
- Removing the collision geometry does not help.
- Changing the global scale of the collision and visual geometry gives similar results.

I am beginning to wonder whether my issue is the .obj and .mtl files that I am using. Do you have any other valid example files I could use, or anything else in mind? I tried to follow the provided examples closely, but it just does not seem to work.
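To rule out a malformed mesh, a quick stdlib-only sanity check of the .obj text can help: a file with zero faces cannot produce contacts, while a missing `mtllib` reference only affects appearance, not collision. This checker is a sketch of my own, not a TACTO utility:

```python
def check_obj(text):
    """Count vertices and faces in Wavefront .obj text and record
    whether a material library (mtllib) is referenced.

    A result with faces == 0 is a red flag for collision checking;
    mtllib is None only means the mesh will render untextured.
    """
    verts = faces = 0
    mtllib = None
    for line in text.splitlines():
        parts = line.split()
        if not parts:
            continue
        if parts[0] == "v":
            verts += 1
        elif parts[0] == "f":
            faces += 1
        elif parts[0] == "mtllib" and len(parts) > 1:
            mtllib = parts[1]
    return {"vertices": verts, "faces": faces, "mtllib": mtllib}
```

Run it on the mesh file referenced by the URDF (e.g. `check_obj(open("tomato_soup_can.obj").read())`); if `faces` is 0, the export probably contains only points or a dialect the loader cannot triangulate.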
Hello, I am trying to collect tactile data and images while grasping objects from the YCB dataset. However, the contact is not simulated, as can be seen in the screenshot:
My steps are the following:
- Add the folder with the .stl, .obj, and .mtl files to the objects/ folder
- Create a .urdf very similar to the ones already provided (see attachment)
- Edit the .yaml parameters to import the correct URDF, scale, and position

The URDF is the following:
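The attached URDF itself is not reproduced here, but a minimal single-link, mesh-based URDF of the kind the examples ship might look like the sketch below (file names, mass, scale, and inertia values are placeholders, not the attachment's actual values):

```xml
<?xml version="1.0"?>
<robot name="tomato_soup_can">
  <link name="base_link">
    <visual>
      <geometry>
        <mesh filename="tomato_soup_can.obj" scale="1 1 1"/>
      </geometry>
    </visual>
    <collision>
      <geometry>
        <!-- Note: by default PyBullet approximates a mesh collision
             geometry with its convex hull, which can differ from
             what the visual mesh suggests. -->
        <mesh filename="tomato_soup_can.obj" scale="1 1 1"/>
      </geometry>
    </collision>
    <inertial>
      <mass value="0.1"/>
      <inertia ixx="0.001" ixy="0" ixz="0" iyy="0.001" iyz="0" izz="0.001"/>
    </inertial>
  </link>
</robot>
```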
The grasp.yaml is the following:

```yaml
hydra:
  run:
    dir: ./

object:
  urdf_path: "objects/tomato_soup_can.urdf"
  base_position: [0.50, 0, 0.0]
  global_scaling: 1

tacto:
  width: 120
  height: 160
  visualize_gui: True

sawyer_gripper:
  robot_params:
    urdf_path: "robots/sawyer_wsg50.urdf"
    use_fixed_base: True
  init_state:
    end_effector:
      position: [0.50, 0, 0.215]
      # p.getQuaternionFromEuler([0, np.pi, 0])

pybullet_camera:
  cameraDistance: 0.6
  cameraYaw: 15.
  cameraPitch: -20.
  cameraTargetPosition: [0.5, 0, 0.08]
```
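For what it's worth, the `pybullet_camera` block maps one-to-one onto PyBullet's debug-camera call, so a wrong value here only moves the viewpoint and cannot cause the missing contact. A small helper (my own naming, not TACTO's) to apply it:

```python
def camera_args(cfg):
    """Reorder the pybullet_camera mapping into the positional
    arguments that p.resetDebugVisualizerCamera expects."""
    return (
        cfg["cameraDistance"],
        cfg["cameraYaw"],
        cfg["cameraPitch"],
        cfg["cameraTargetPosition"],
    )

# The values from the grasp.yaml above:
cam = {
    "cameraDistance": 0.6,
    "cameraYaw": 15.0,
    "cameraPitch": -20.0,
    "cameraTargetPosition": [0.5, 0, 0.08],
}
# In the demo this would be applied with:
#   p.resetDebugVisualizerCamera(*camera_args(cam))
```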