AndrejOrsula / drl_grasping

Deep Reinforcement Learning for Robotic Grasping from Octrees
https://arxiv.org/pdf/2208.00818
BSD 3-Clause "New" or "Revised" License

Adding static Objects to the Reach Task #100

Closed Nils-ChristianIseke closed 2 years ago

Nils-ChristianIseke commented 2 years ago

Hey, I am trying to modify the Reach task by adding random static objects as obstacles. Are there any tips on how to add those?

AndrejOrsula commented 2 years ago

You can try taking a look at how objects are inserted inside drl_grasping/envs/randomizers/manipulation.py (e.g. randomize_object_models() or randomize_object_primitives()). I don't fully remember the state of the code at version 1.1.0, so it might not be super modular... but you will see.

The list of currently available models (including objects) is inside drl_grasping/envs/models directory. You can add custom ones if you need (by using any of the current models as a template), or just use drl_grasping/envs/models/random_object.py with a specific dataset of SDF models (from Fuel or local Fuel cache).
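To make the idea concrete, here is a minimal sketch of sampling a static primitive obstacle. All names, workspace bounds, and the spec-dict layout are hypothetical illustrations, not drl_grasping's actual API; the real randomizers spawn SDF models through Gazebo instead.

```python
import random

# Hypothetical sketch: sample a random static primitive obstacle for the
# Reach task. Names and workspace bounds are illustrative only.
PRIMITIVES = ("box", "sphere", "cylinder")

def sample_static_obstacle(rng: random.Random,
                           workspace=((0.3, 0.7), (-0.2, 0.2), (0.0, 0.0))):
    """Return a spec dict that a spawn routine could turn into an SDF model."""
    position = tuple(rng.uniform(lo, hi) for lo, hi in workspace)
    return {
        "type": rng.choice(PRIMITIVES),
        "size": rng.uniform(0.03, 0.08),  # characteristic dimension in metres
        "position": position,
        "static": True,  # a static model does not move when touched
    }

rng = random.Random(0)
obstacle = sample_static_obstacle(rng)
print(obstacle["type"], obstacle["static"])
```

The important bit for obstacles (as opposed to graspable objects) is marking the model static so physics does not move it; in SDF that corresponds to the model's `<static>` element.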

If you add objects, visual observations should be usable straight away (colour and depth image obs). If you want to utilise state-based observations (vanilla Reach task), then you probably want to add these obstacles into the observation space somehow (here and here). I should note that I have not experimented much with the Reach task; I only used it for quick sanity tests during early development. Compared to the Grasp task, it was much simpler to solve (i.e. get decent results), since the actions are in Cartesian space.
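One way to fold obstacles into state-based observations is simply appending each obstacle's position to the flat state vector. The sketch below is a plain-NumPy illustration under that assumption, not the repository's actual observation code; remember you would also have to widen the observation-space bounds by the same number of elements.

```python
import numpy as np

# Illustrative sketch: extend a flat state observation with obstacle positions.
def extend_observation(base_obs: np.ndarray, obstacle_positions) -> np.ndarray:
    """Concatenate obstacle (x, y, z) positions onto the base state vector."""
    if obstacle_positions:
        extra = np.concatenate([np.asarray(p, dtype=np.float32).ravel()
                                for p in obstacle_positions])
    else:
        extra = np.empty(0, dtype=np.float32)
    return np.concatenate([base_obs.astype(np.float32), extra])

base = np.zeros(7, dtype=np.float32)  # e.g. an end-effector pose placeholder
obstacles = [(0.5, 0.1, 0.0), (0.4, -0.1, 0.0)]
obs = extend_observation(base, obstacles)
print(obs.shape)  # → (13,)
```

If the number of obstacles varies between episodes, you would either fix a maximum count and pad with a sentinel value, or keep the count constant so the observation shape stays fixed.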

Nils-ChristianIseke commented 2 years ago

Thanks for your tips. :)