ShangtongZhang / DeepRL

Modularized Implementation of Deep RL Algorithms in PyTorch
MIT License

set_one_thread() in example.py #63

Closed jyf588 closed 5 years ago

jyf588 commented 5 years ago

Hello Shangtong,

Sorry to bother you, but I am new to PyTorch. A quick question: what is the purpose of set_one_thread() in example.py? Is it simply equivalent to setting num_workers = 1? Am I understanding your code correctly that every time I want to run an algorithm with num_workers > 1, I should comment out set_one_thread()?

ShangtongZhang commented 5 years ago

No. Always use set_one_thread. It's for OpenMP and MKL.
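
For reference, a minimal sketch of what a set_one_thread()-style helper typically does, assuming it follows the usual pattern of pinning the OpenMP/MKL and PyTorch intra-op thread pools to one thread:

```python
# Sketch of a set_one_thread()-style helper: limit BLAS/OpenMP and PyTorch
# intra-op parallelism to a single thread.
import os
import torch

def set_one_thread():
    os.environ['OMP_NUM_THREADS'] = '1'  # OpenMP worker threads
    os.environ['MKL_NUM_THREADS'] = '1'  # Intel MKL worker threads
    torch.set_num_threads(1)             # PyTorch intra-op threads
```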

jyf588 commented 5 years ago

So if I want to use multiple (>1) workers/cores, I should comment out set_one_thread, right?

ShangtongZhang commented 5 years ago

No. Never comment it out.

jyf588 commented 5 years ago

I should have made this clearer: the reason I was asking is that when I run python example.py, even with num_workers > 1, the machine still only uses one core. I can only get my machine to use multiple cores with set_one_thread commented out.

Am I misunderstanding how to use your code? Thanks.

ShangtongZhang commented 5 years ago

'num_workers > 1' does not necessarily mean we need to start multiple threads. I use a single thread to simulate multiple workers. This usually makes better use of CPU resources, because you can run multiple seeds simultaneously. Unless you have hundreds of CPUs, I do not expect 'real' multiple workers to help in terms of overall experiment time. If you want real multiple workers, set this flag to False (but I haven't used it in quite a long time): https://github.com/ShangtongZhang/DeepRL/blob/master/deep_rl/component/envs.py#L157
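
For illustration, a hedged sketch of toggling that flag, assuming it is exposed as a single_process argument on the Task wrapper in envs.py (the exact argument name and import path are assumptions, not confirmed here):

```python
# Hypothetical usage; assumes Task exposes the single_process flag linked above.
from deep_rl import Task

# single_process=True (the usual default): all workers simulated in one process.
# single_process=False: each worker runs in its own subprocess ('real' workers).
task = Task('CartPole-v0', num_envs=4, single_process=False)
```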

If you comment set_one_thread out, you allow PyTorch to do things like matrix multiplication with multiple threads; it has nothing to do with multiple workers. I find this does not help: I would rather run multiple seeds simultaneously than do matrix multiplication with multiple threads.
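
To make the "multiple seeds simultaneously" pattern concrete, here is a minimal sketch (not code from this repo) that launches several single-threaded runs in parallel, one process per seed; the --seed flag on example.py is hypothetical:

```python
# Sketch: run several single-threaded experiments in parallel, one process per seed.
# The '--seed' command-line flag is hypothetical; adapt to however example.py takes a seed.
import subprocess

seeds = [0, 1, 2, 3]
procs = [subprocess.Popen(['python', 'example.py', '--seed', str(s)]) for s in seeds]
for p in procs:
    p.wait()  # block until every run finishes
```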

ShangtongZhang commented 5 years ago

set_one_thread has now been removed; the same functionality is implemented in https://github.com/ShangtongZhang/DeepRL/blob/master/docker_python.sh#L9