**Closed** · RongXueZoe closed this issue 1 week ago
Hi,
thank you for being so detailed.

The environment used in `1_a_minimal_DBC_implementation.py` is d4rl Kitchen, which differs from the RelayKitchen used in `dbc_kitchen.py`; therefore, we cannot mix their rendering methods. We will implement a unified logger for saving and rendering as soon as possible. In the meantime, we recommend an alternative approach: you can obtain an RGB image from the d4rl Kitchen environment with `env.sim.render()`, and then use a library such as `imageio` to compose a video from the frames saved over an episode.
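A minimal sketch of this workaround is shown below. The `env.sim.render()` keyword arguments (`width`, `height`, `mode`) follow mujoco-py's `MjSim.render` and are an assumption that may differ in your installed version; dummy frames are used so the sketch runs standalone:

```python
import numpy as np
import imageio

# Collect one episode's frames. With a d4rl Kitchen env you would do roughly:
#   frame = env.sim.render(width=480, height=480, mode="offscreen")
# inside the evaluation rollout loop (argument names follow mujoco-py and are
# an assumption -- check your installed version). Placeholder frames are used
# here so the sketch is self-contained.
frames = [
    np.full((64, 64, 3), fill_value=8 * t, dtype=np.uint8)  # dummy RGB frame
    for t in range(20)
]

# Compose the collected frames into an animation with imageio. GIF output
# avoids needing an ffmpeg backend; "episode.mp4" works if imageio-ffmpeg
# is installed.
imageio.mimsave("episode.gif", frames)
```

In a real script, append the `env.sim.render(...)` output to `frames` at every step of the rollout, then call `imageio.mimsave` once at the end of the episode.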
Thank you once again for your inquiry. We will work on addressing this issue.
This involves a few things: we use the `kitchen` dataset from D4RL for the RL algorithm and the Relay-kitchen dataset for the IL algorithm. Even though both datasets share the same underlying physics engine, parts of their APIs are not exactly the same, such as rendering (`VideoRecorder`, `logger`, and `render`).
We recommend direct calls to the original environment API as an alternative solution, and we will continue to optimize CleanDiffuser.
I have added some code for visualization in tutorial 1. You can pull the latest version to try it out. I will close this issue for now. If you have any further questions, please feel free to reopen it.
Thank you for your detailed reply! The latest version works fine. Thanks again for your great work:)
Hi,

I've trained several agents following the tutorials and now I want to render videos like `dp_pusht.py` does. Since the env used in tutorial 1 is `kitchen-complete-v0`, I modified `dbc_kitchen.py` as follows: I changed `env_name` from `kitchen-all-v0` to `kitchen-complete-v0` and imported d4rl in advance, as in `1_a_minimal_DBC_implementation.py`, with the `use_abs_action` arg in `gym.make(args.env_name, use_abs_action=args.abs_action)`.

However, it seems that simply changing the `env_name` is not enough when initializing the env. The error goes:

I'll appreciate it if you could tell me how to render a video like the push T task does, using my own trained model.