Closed alietestep closed 4 years ago
I'm also interested in having a custom ENV for robotic simulation. One example would be having Dreamer work with Gazebo.
Options that might work:
Gym environments should work. However, they need to return dictionary observations that contain an image
key. You can use a simple env wrapper if your environment doesn't use dictionary observations. Also check out the newer code base that runs faster and is quite a bit easier to work with: http://github.com/danijar/dreamer
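The wrapper described above can be sketched roughly as follows. This is a minimal, hypothetical example (the class names `DictObsWrapper` and `DummyImageEnv` are made up for illustration, not part of Dreamer's code): it wraps any gym-style env whose observations are raw image arrays and repackages each observation as a dictionary with an `image` key.

```python
import numpy as np


class DictObsWrapper:
    """Sketch of an observation wrapper: turns raw image observations
    into dict observations with an 'image' key, as Dreamer expects.
    Hypothetical name, not part of the Dreamer code base."""

    def __init__(self, env):
        self.env = env

    def reset(self):
        # Repackage the initial observation as a dict.
        return {'image': self.env.reset()}

    def step(self, action):
        obs, reward, done, info = self.env.step(action)
        return {'image': obs}, reward, done, info

    def __getattr__(self, name):
        # Forward everything else (action_space, render, ...) to the inner env.
        return getattr(self.env, name)


class DummyImageEnv:
    """Stand-in for a simulator env that returns raw 64x64 RGB frames."""

    def reset(self):
        return np.zeros((64, 64, 3), np.uint8)

    def step(self, action):
        return np.zeros((64, 64, 3), np.uint8), 0.0, False, {}


env = DictObsWrapper(DummyImageEnv())
obs = env.reset()
assert set(obs.keys()) == {'image'} and obs['image'].shape == (64, 64, 3)
```

The same pattern should carry over to a real simulator bridge (Gazebo, AirSim, etc.): keep the simulator's step/reset interface, and only repackage the sensor image into the dict shape the agent expects.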
@danijar thanks for the reply, I'm working on RL control in the AirSim simulator. I've done some tests with DDPG, D4PG and others, but they are all time- and resource-consuming, and on complex tasks the results actually aren't that good. Dreamer seems interesting if it can learn control with much less interaction and compute, so I'm going to go through the simpler code base and get back with results.
Sounds good, let me know if you run into problems setting it up and I'll see if I can provide more suggestions (especially if you're using the new code base).
Hi, thanks for the great work. I'm looking for a way to use Dreamer on custom envs. These envs are gym-like wrappers and support almost all of gym's methods. I'm talking about robotic simulators where you can get sensor inputs such as camera images. I've looked through the code for a couple of days; it's complicated and I've had no success yet. Any tutorial, example, tips, or walkthrough would be great.