jtoy / sensenet

SenseNet is a sensorimotor and touch simulator for deep reinforcement learning research

touch / camera sensor needs to be more accurate #5

Closed jtoy closed 6 years ago

jtoy commented 6 years ago

It works at certain angles, but not all angles. Need to test and verify this. This is how the code should work:

You will need to compute the view matrix based on the index finger's world transform. You can get the transform using the pybullet.getLinkState API, extract the position and orientation from it, and build the view matrix using pybullet.computeViewMatrix. It is a bit complex; the racecar example (https://github.com/bulletphysics/bullet3/blob/449c8afc118a7f3629bc940c304743a084dcfac6/examples/pybullet/gym/pybullet_envs/bullet/racecarZEDGymEnv.py#L95) computes the camera based on the chassis. The finger's 3x3 world rotation matrix is basically its fwd, left, up vectors.
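A rough sketch of that computation. The quaternion-to-matrix conversion is written out in plain Python so it is self-contained (with pybullet you would use p.getMatrixFromQuaternion instead); the choice of which axes are "forward" and "up", and the 0.1 m look-ahead offset, are assumptions that would need checking against the actual finger link frame:

```python
import math

def quat_to_matrix(q):
    # Convert a quaternion (x, y, z, w) to a row-major 3x3 rotation matrix,
    # matching the convention of pybullet.getMatrixFromQuaternion.
    x, y, z, w = q
    return [
        [1 - 2*(y*y + z*z), 2*(x*y - z*w),     2*(x*z + y*w)],
        [2*(x*y + z*w),     1 - 2*(x*x + z*z), 2*(y*z - x*w)],
        [2*(x*z - y*w),     2*(y*z + x*w),     1 - 2*(x*x + y*y)],
    ]

def finger_camera_vectors(pos, orn, forward_axis=0, up_axis=2, offset=0.1):
    # The columns of the link's rotation matrix are its fwd/left/up vectors
    # in world coordinates.  Put the camera eye at the fingertip and aim it
    # `offset` meters along the finger's forward axis.
    m = quat_to_matrix(orn)
    fwd = [m[i][forward_axis] for i in range(3)]
    up = [m[i][up_axis] for i in range(3)]
    eye = list(pos)
    target = [pos[i] + offset * fwd[i] for i in range(3)]
    return eye, target, up
```

In the simulator this would plug in roughly as: `pos, orn = p.getLinkState(hand_id, finger_link)[:2]`, then `view = p.computeViewMatrix(eye, target, up)` (here `hand_id` and `finger_link` are hypothetical names for the hand body and index-finger link index).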

jtoy commented 6 years ago

this can be "unit tested" with scripts/touch_test.py

jtoy commented 6 years ago

Use the depth camera, and also try a smaller camera.

jtoy commented 6 years ago

much better now