facebookresearch / torchbeast

A PyTorch Platform for Distributed RL
Apache License 2.0

Excessive memory use in monobeast.py #44

Open baskuit opened 1 year ago

baskuit commented 1 year ago

My collaborator and I both experience this issue.

python -m torchbeast.monobeast --env PongNoFrameskip-v4 --num_actors 1 --num_buffers 2 --total_steps 5000000

The memory in the learner process increments occasionally until it chokes our machines (8 GB memory, ~5 GB for the learner process) at around 1 million steps. I spoke to someone on Discord about this, and they did not have the issue. However, I note that the graph of memory usage they posted showed only around 300 MB of use, which seems too small (but I suppose it's really network-dependent).
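To help pinpoint whether the learner process itself is the one accumulating memory, a small stdlib-only monitor can log peak resident set size as training progresses. This is a hypothetical diagnostic sketch, not part of torchbeast; `MemoryMonitor` and `rss_mb` are names I made up, and the idea is just to call `maybe_log` from inside the learner loop:

```python
import resource
import sys


def rss_mb():
    """Peak resident set size of this process in MiB.

    Note: ru_maxrss is reported in KiB on Linux but in bytes on macOS.
    """
    peak = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
    if sys.platform == "darwin":
        return peak / (1024 * 1024)
    return peak / 1024


class MemoryMonitor:
    """Logs peak RSS every `interval` learner steps and the delta
    since the previous log, to make steady growth visible."""

    def __init__(self, interval=10000):
        self.interval = interval
        self.last = 0.0

    def maybe_log(self, step):
        if step % self.interval == 0:
            current = rss_mb()
            print(f"step={step} peak_rss={current:.1f} MiB "
                  f"(+{current - self.last:.1f})")
            self.last = current


# Example: call once per step from the learner loop.
monitor = MemoryMonitor(interval=10000)
for step in range(0, 30001, 10000):
    monitor.maybe_log(step)
```

If the logged peak keeps climbing roughly linearly with steps (rather than plateauing after the buffers are allocated), that points at an accumulation bug rather than normal buffer usage.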