openai / procgen

Procgen Benchmark: Procedurally-Generated Game-Like Gym-Environments
https://openai.com/blog/procgen-benchmark/
MIT License

How to decrease the size of environment output #54

Closed shizhec closed 4 years ago

shizhec commented 4 years ago

Hi, I was trying to train an IMPALA agent with intrinsic rewards on the Procgen environments. Unfortunately, with the original environment output, my model ran out of 113 GB of memory within 2M frames of training, and that's the maximum memory I can request. So I wonder if there are some modifications I could make to the environment to reduce the size of the environment output (e.g., alongside reducing the buffer size in my model)? I have thought of some possibly feasible options:

  1. reduce the observation shape from (64, 64, 3) to (32, 32, 3)
  2. set the background to none
  3. use restrict_themes and use_monochrome_assets.

Could the above options work, and are there other options?
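A minimal sketch of how options 2 and 3 might be passed to the environment (the game choice and level settings below are placeholders, not taken from this thread):

```python
# Sketch: a Procgen env with reduced visual variety. The game and level
# settings are placeholders; only the three rendering flags matter here.
import gym

env = gym.make(
    "procgen:procgen-coinrun-v0",
    num_levels=0,                 # placeholder level settings
    distribution_mode="easy",
    use_backgrounds=False,        # option 2: plain black backgrounds
    restrict_themes=True,         # option 3: a single theme per game
    use_monochrome_assets=True,   # option 3: monochrome assets
)

obs = env.reset()
print(obs.shape)  # still (64, 64, 3); option 1 would need an observation wrapper
```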

christopher-hesse commented 4 years ago

I'm not sure why you would run out of memory; are you using a replay buffer of unbounded size?

shizhec commented 4 years ago

I'm using the torchbeast (monobeast) version of the IMPALA implementation. I'm sure the buffer size is limited.

christopher-hesse commented 4 years ago

My first guess is that your replay buffer is too large and you should make it smaller and/or resize the observations with an environment wrapper. What's using all the memory in your case?
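For illustration, a wrapper along these lines (a sketch assuming OpenCV is available; the 32x32 target comes from the list above) would shrink each stored observation:

```python
# Sketch of an observation-resizing wrapper; not from torchbeast or procgen itself.
import gym
import numpy as np
import cv2


class ResizeObservation(gym.ObservationWrapper):
    """Downscale image observations to reduce memory used by stored rollouts."""

    def __init__(self, env, size=(32, 32)):
        super().__init__(env)
        self.size = size
        self.observation_space = gym.spaces.Box(
            low=0, high=255, shape=(*size, 3), dtype=np.uint8
        )

    def observation(self, obs):
        # INTER_AREA is a reasonable choice when shrinking images
        return cv2.resize(obs, self.size, interpolation=cv2.INTER_AREA)


env = ResizeObservation(gym.make("procgen:procgen-coinrun-v0"), size=(32, 32))
```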

shizhec commented 4 years ago

I tried to decrease the buffer size, but I still got OOM when training 2M frames with 113 GB of memory. I've since scaled down the observation shape, and now I can train 5M frames with 113 GB of memory. So I guess the environment output or the buffer isn't getting freed during training?
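A back-of-the-envelope check of that guess (a sketch with assumed numbers, not measurements from this run):

```python
# Rough sizing (assumption): if every frame were retained rather than freed,
# 2M observations at (64, 64, 3) add up quickly, especially if stored as float32.
frames = 2_000_000
bytes_per_obs = 64 * 64 * 3                  # one uint8 observation
uint8_gb = frames * bytes_per_obs / 1e9      # ~24.6 GB
float32_gb = uint8_gb * 4                    # ~98.3 GB
print(f"uint8: {uint8_gb:.1f} GB, float32: {float32_gb:.1f} GB")
```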

christopher-hesse commented 4 years ago

Yeah, you should profile the memory usage. Sounds like you have a leak in your code or else it's just storing a very large amount of data. There was a memory leak from gym3 when procgen was rendering to a window, but that was fixed in the most recent version.
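One simple way to do that (a sketch using psutil, which isn't mentioned in the thread) is to log the process's resident memory between iterations and watch for steady growth:

```python
# Sketch: logging resident memory between training iterations with psutil.
# train_one_iteration is a placeholder for a real learner/actor step.
import os
import psutil


def train_one_iteration():
    pass  # placeholder: replace with an actual training step


proc = psutil.Process(os.getpid())
for step in range(10):
    train_one_iteration()
    rss_gb = proc.memory_info().rss / 1e9
    print(f"step {step}: resident memory {rss_gb:.2f} GB")
```

If the resident size grows without bound, a tool like tracemalloc can help attribute the allocations to specific code paths.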

shizhec commented 4 years ago

Thanks, I will do that! Closing this issue now.