Open maraoz opened 4 years ago
I found the `--video-dir` option in `rollout.py`, but it's only creating some .json files when I use it.
@maraoz If you are using procgen==0.10.x, you need to pass `"render_mode": "rgb_array"` as env config. There is a bug in the gym3 -> gym conversion. We added a temporary fix for that in https://github.com/AIcrowd/neurips2020-procgen-starter-kit/pull/24.
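For reference, the env config change amounts to adding the `render_mode` key. This is a minimal sketch, assuming an RLlib-style config dict like the starter kit's; the `env` name and the other `env_config` keys here are illustrative, not taken from the starter kit:

```python
# Hedged sketch: passing render_mode through the env config.
# "procgen_env_wrapper" and "env_name" are illustrative assumptions;
# the key that matters for procgen==0.10.x is "render_mode".
config = {
    "env": "procgen_env_wrapper",
    "env_config": {
        "env_name": "coinrun",
        "render_mode": "rgb_array",  # needed so the gym conversion exposes frames
    },
}

print(config["env_config"]["render_mode"])
```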
We generate videos every few iterations during evaluation and post them on the issues page and the dashboard.
At present the fix suggested in #24 is not working for me.
I've only been able to get procgen rendering to work under gym3 via:
```python
from gym3 import types_np
from gym3 import VideoRecorderWrapper
from procgen import ProcgenGym3Env

env = ProcgenGym3Env(num=1, env_name="coinrun", render_mode="rgb_array")
env = VideoRecorderWrapper(env=env, directory=".", info_key="rgb")

step = 0
while True:
    env.act(types_np.sample(env.ac_space, bshape=(env.num,)))
    rew, obs, first = env.observe()
    print(f"step {step} reward {rew} first {first}")
    if step > 0 and first:
        break
    step += 1
```
But the starter kit uses gym, so the above approach does not work.
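A generic fallback for gym-style envs is to capture `render(mode="rgb_array")` frames in a wrapper after every step and write them out yourself. This is a minimal sketch of that pattern with a stdlib-only dummy env standing in for the real procgen gym env (`DummyEnv` and `FrameRecorder` are hypothetical names, not part of gym or the starter kit), assuming the old gym step API:

```python
# Hedged sketch: a frame-collecting wrapper pattern for gym-style envs.
# DummyEnv is a stand-in for the real procgen gym env, for illustration only.

class DummyEnv:
    """Stands in for a gym env; render() returns a placeholder RGB frame."""
    def __init__(self):
        self.t = 0

    def reset(self):
        self.t = 0
        return [0.0]

    def step(self, action):
        self.t += 1
        done = self.t >= 3  # episode ends after 3 steps
        return [0.0], 1.0, done, {}

    def render(self, mode="rgb_array"):
        # A real env would return an HxWx3 uint8 array here.
        return [[(self.t, self.t, self.t)]]


class FrameRecorder:
    """Wraps an env and captures a rendered frame on reset and after each step."""
    def __init__(self, env):
        self.env = env
        self.frames = []

    def reset(self):
        obs = self.env.reset()
        self.frames.append(self.env.render(mode="rgb_array"))
        return obs

    def step(self, action):
        obs, rew, done, info = self.env.step(action)
        self.frames.append(self.env.render(mode="rgb_array"))
        return obs, rew, done, info


env = FrameRecorder(DummyEnv())
obs = env.reset()
done = False
while not done:
    obs, rew, done, info = env.step(0)

print(len(env.frames))  # 1 reset frame + 3 step frames = 4
```

With the real env, the collected `frames` can then be encoded to a video with any writer you have available (e.g. ffmpeg or imageio).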
I've been able to get the repo up and running, and could `train` and `rollout` successfully, but I didn't understand much of what was going on. Is it possible to actually visually see the agent playing the games? Thanks!