-
I installed dm_control using pip and I was attempting to try it out with [this repository](https://github.com/danijar/dreamerv2). When I ran this command: `python3 dreamerv2/train.py --logdir ~/logdir…
-
We were able to run dreamerv2 on the Minigrid environment by following the README.md. We are now trying to run dreamerv2 on Atari games, but the environment loaded from an Atari game, es…
-
https://github.com/danijar/dreamerv2/blob/e02ceb9a77596e695a66d099e749b6bdf62db50e/dreamerv2/train.py#L129
Since the number of evaluation environments is `num_eval_envs`, I think there should b…
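For context, a minimal sketch of the pattern in question (a hypothetical `make_env` factory; `num_envs` and `num_eval_envs` mirror the config names in train.py; which count bounds the eval-env loop is exactly what the issue asks about):

```python
def make_env_sets(make_env, num_envs, num_eval_envs):
    # Build training and evaluation environments from separate counts.
    # The point of the issue: the eval loop should be bounded by
    # num_eval_envs, not num_envs.
    train_envs = [make_env('train') for _ in range(num_envs)]
    eval_envs = [make_env('eval') for _ in range(num_eval_envs)]
    return train_envs, eval_envs
```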
-
https://github.com/danijar/dreamerv2/blob/912ec5da79467b22917cce683c776f034850f91d/dreamerv2/configs.yaml#L24
Hi Danijar,
do I understand correctly that this line should have batch = 50 to have s…
-
It seems that the agent's state (`self._state`) is not reset to zeros on episode reset. It is `None` only in the very first episode, so it gets set to zeros then. Since `driver.reset()` is never called aga…
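A minimal sketch of the behavior the issue implies (a hypothetical simplification, not dreamerv2's actual `common.Driver`): clearing `self._state` in `reset()` so each episode starts from zeros again:

```python
class Driver:
    def __init__(self):
        # Recurrent policy state; None means "initialize from zeros".
        self._state = None

    def reset(self):
        # Clear the state so the next episode does not inherit the
        # stale recurrent state from the previous one.
        self._state = None

    def step(self, policy, obs):
        # The policy initializes from zeros whenever the state is None.
        action, self._state = policy(obs, self._state)
        return action
```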
-
When running `python3 common/plot.py --indir ~/logdir/exp --outdir ~/plots --xaxis step --yaxis eval_return --bins 1e6` I get:
`NotADirectoryError: [Errno 20] Not a directory: '/home/USER/logdir/ex…
-
Hi Danijar,
I'm currently doing a project where I'm running DreamerV2 on some of the alternative exploration agents. I have two questions:
1. How does train_dataset update to include samples from …
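On the first question, a minimal sketch of the general mechanism (hypothetical names; dreamerv2 builds its dataset from the replay store, and the core idea is that sampling reads the live episode collection on every draw):

```python
import random

def replay_dataset(episodes, seed=0):
    # Infinite sampler over a *live* episode store: each draw reads
    # `episodes` at yield time, so episodes appended during training
    # become eligible without rebuilding the dataset.
    rng = random.Random(seed)
    while True:
        yield rng.choice(episodes)
```

Appending to `episodes` after creating the iterator is then enough for new samples to start appearing.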
-
This is the command line and the output I get:
`(tf2) marten@dpserver:~/rl/dreamerv2$ python3 dreamerv2/train.py --logdir ~/logdir/dmc_walker_walk/dreamerv2/1 --configs dmc --task dmc_walker_walk
Trace…
-
Hello, danijar! First of all, thanks for your work :)
I've been trying out dreamerv2 this past week and tried to reproduce the Riverraid results. However, I was unsuccessful; the agent only reache…
-
Hello,
Thanks for the code! Just want to confirm, are the scores in [`dreamer.json`](https://github.com/danijar/dreamer/blob/56d4d444dfd0582b0e79dab80aebbea74c0ce40d/scores/dreamer.json) from …