ac-93 / tactile_gym

Suite of PyBullet reinforcement learning environments targeted towards using tactile data as the main form of observation.
GNU General Public License v3.0

How to use different sensors? #11

Open keiohta opened 1 year ago

keiohta commented 1 year ago

Hi authors, thank you very much for sharing this awesome repo!

I'm currently using GelSight Mini from GelSight Inc., and am wondering if I can somehow use the codebase to simulate deformations of the GelSight Mini sensor.

I think the GelSight Mini sensor is very similar to Digit, so I imagine this simulator can simulate it by changing some parameters, such as parameters that define sensor size, geometries of the gel, depth of the gel, pose of the sensor, etc. Even though the simulation won't be perfect, it can provide us with some reasonable features such as edges.
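The parameters listed above could be gathered into a small per-sensor spec. A minimal sketch (the names and numeric values below are illustrative placeholders, not tactile_gym's actual API):

```python
# Hypothetical sensor descriptions; field names and values are rough
# placeholders for the geometry/pose parameters a sensor swap might touch.
GELSIGHT_MINI = {
    "image_size": (320, 240),        # rendered tactile image resolution
    "gel_thickness_mm": 4.0,         # approximate depth of the deformable gel
    "sensing_area_mm": (18.0, 14.0), # flat sensing surface (approximate)
    "camera_fov_deg": 60.0,          # virtual camera field of view
    "camera_offset_mm": 20.0,        # camera distance behind the gel surface
}

DIGIT = {
    "image_size": (240, 320),
    "gel_thickness_mm": 5.0,
    "sensing_area_mm": (19.0, 16.0),
    "camera_fov_deg": 60.0,
    "camera_offset_mm": 15.0,
}
```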

So, could you share some workarounds or tips to adopt different simulators?

Thanks!

ac-93 commented 1 year ago

Do you have access to the mesh files (.obj, .stl, etc) for the Gelsight Mini?

The way we've done this is by attaching these meshes onto a robot arm within the .urdf (such as here).

We then create a sensor 'skin' and 'core' which approximate the shape of real sensor skin and gel. The skin is a thin layer that doesn't obscure the camera view and the core is a solid shape that does not contribute to any visual rendering. Then we place a virtual camera within this mesh in the same place it would be for the real sensor, tuning some parameters to get a reasonable match between real and sim.
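The "virtual camera inside the mesh" step can be sketched with a plain look-at view matrix; in the actual codebase PyBullet's `computeViewMatrix` plays this role. A numpy-only sketch, assuming the sensor pose is expressed in world coordinates (the 20 mm offset below is an arbitrary example):

```python
import numpy as np

def look_at(eye, target, up):
    """Build a 4x4 OpenGL-style view matrix for a camera at `eye`
    looking at `target`. Same idea as pybullet.computeViewMatrix."""
    eye, target, up = map(np.asarray, (eye, target, up))
    f = target - eye
    f = f / np.linalg.norm(f)   # forward axis
    s = np.cross(f, up)
    s = s / np.linalg.norm(s)   # right axis
    u = np.cross(s, f)          # orthogonalised up axis
    view = np.eye(4)
    view[0, :3], view[1, :3], view[2, :3] = s, u, -f
    view[:3, 3] = -view[:3, :3] @ eye  # translate world into camera frame
    return view

# Place the virtual camera where it would sit inside the real sensor,
# e.g. 20 mm behind the gel surface, looking out through the skin.
eye = [0.0, 0.0, -0.020]
target = [0.0, 0.0, 0.0]  # centre of the gel surface
view = look_at(eye, target, [0.0, 1.0, 0.0])
```

The resulting matrix (plus a projection matrix) is what a synthetic depth render of the skin mesh would use.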

It's worth noting that tactile_gym won't output images that appear similar to those from GelSight-style sensors like the GelSight Mini or DIGIT. It's more focussed on contact geometry via depth imaging. With some domain adaptation you might be able to bridge the gap though.
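To illustrate what "contact geometry via depth imaging" can still give you (e.g. the edge features mentioned in the question), here is a toy numpy sketch; the function and thresholds are invented for illustration and are not tactile_gym's own processing:

```python
import numpy as np

def contact_edges(depth, gel_depth=0.004, threshold=0.0005):
    """Extract a crude edge map from a simulated tactile depth image.

    `depth` is an HxW array of camera-to-surface distances; anything
    measurably closer than the undeformed gel counts as contact.
    (Illustrative only; real pipelines differ.)
    """
    contact = (gel_depth - depth) > threshold  # boolean contact mask
    # Finite-difference gradient of the mask marks contact boundaries.
    gy, gx = np.gradient(contact.astype(float))
    return (np.hypot(gx, gy) > 0).astype(np.uint8)

# Toy depth image: flat gel with a square indentation in the middle.
depth = np.full((8, 8), 0.004)
depth[2:6, 2:6] = 0.002
edges = contact_edges(depth)  # nonzero only around the indentation border
```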