[Open] dstamos opened this issue 4 years ago
I also found that the memory footprint seems to increase constantly. My temporary workaround is to set `ray.remote(max_calls=1)` in the function decorator, which forces Ray to start a new worker for each task. Of course, this adds some overhead every time Ray spawns a new worker process.
Could you try the latest wheel, or version 0.8.3 (it will be released next week)? That might resolve the problem.
Ray version: 0.8.0

I am using a machine with 220 GB of RAM running Ubuntu 16.04.4.
I would like to run the following script while keeping the memory footprint to a minimum, since all I need is the latest `new_results`. When I watch htop, the memory usage keeps increasing at each iteration. Why is that? Is there a setting among the `ray.init()` parameters that would prevent this? Is there another workaround?
I have considered calling `ray.shutdown()` at the end of each iteration, but that seems unstable/unreliable. I might open a separate issue about this later.