Closed rudrapoudel closed 2 years ago

Couldn't find the code for robot navigation from the iGibson 1.0 paper. Thank you for the awesome simulator!
Hey,
I had a similar question back then. I found stable_baselines3_example.py in igibson/examples/demo, which uses PPO from SB3 and gives the agent RGB + depth observations along with some proprioceptive inputs.
Hi @rudrapoudel and @sycz00,
We had two navigation tasks in iG1: object navigation (to a lamp) and point navigation based on LiDAR. I could ask the students to share the old code; however, the API has changed in iGibson 2 and it may be hard to use as is.
Nevertheless, @sycz00 is right: the code in examples/learning/stable_baselines3_example.py should get you to a similar starting point (a rough sketch of what it sets up is below). You could change the observations and/or the scene generation to obtain policies similar to those in iG1.
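For anyone who wants a concrete starting point, here is a minimal sketch of the kind of training setup that example builds, assuming an iGibson 2 install alongside Stable Baselines 3. The config path, timestep values, and save name are placeholders, and note that the shipped example also defines a custom feature extractor for the image observations rather than relying on SB3's defaults:

```python
# Minimal sketch, not the shipped example: PPO from SB3 on an iGibson 2 env.
from igibson.envs.igibson_env import iGibsonEnv
from stable_baselines3 import PPO

env = iGibsonEnv(
    config_file="configs/turtlebot_nav.yaml",  # hypothetical config; point this at your task YAML
    mode="headless",                            # off-screen rendering, suitable for training
    action_timestep=1 / 10.0,                   # illustrative control frequency
    physics_timestep=1 / 120.0,                 # illustrative physics frequency
)

# iGibson returns dict observations (e.g. rgb, depth, task_obs), so SB3's
# "MultiInputPolicy" is needed; the shipped example additionally swaps in a
# custom CNN feature extractor for the image modalities.
model = PPO("MultiInputPolicy", env, verbose=1)
model.learn(total_timesteps=100_000)
model.save("ppo_igibson_nav")
```

As far as I remember, the observation modalities are listed under the `output` key of the task YAML in iGibson 2 (e.g. `rgb`, `depth`, `scan` for LiDAR), so switching observations is mostly a config change rather than a code change.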
About the second question: yes, it should converge as is.
I hope this helps!