georgymh / decentralized-ml

Interoperable and decentralized machine learning.
Apache License 2.0

Multithreaded scheduler (redo) #5

Closed georgymh closed 6 years ago

georgymh commented 6 years ago

From @neeleshdodda44's reverted pull request #3:

Comments in the code are hopefully pretty self-explanatory, but to summarize, plus some extra details:

  • Using Ray to run runners asynchronously
  • DMLRunnerActor is just a wrapper for DMLRunner (didn't want to mess with the tests for DMLRunner, which would clearly fail if I annotated DMLRunner as the actor)
  • Can run with as many runners as you want (default is 4), but I ran into some odd errors and little speed boost when using too many. Feel free to change the default.
  • The test for speedup is fairly naive (Georgy mentioned that the scheduler is throwaway, so I didn't want to test by hardcoding stuff).
  • New to pull requests, so I don't know if I'm the one merging...? If not, I'm fairly sure you can just add the new test file, add the new actor class, and replace the scheduler code with mine (unless there were significant changes to runner.py or something).
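The actor-wrapper pattern described above (keep `DMLRunner` untouched, wrap it in an actor class, and dispatch jobs to a configurable pool of runners asynchronously) can be sketched roughly like this. The real PR uses Ray's `@ray.remote` actors; a stdlib thread pool stands in here so the sketch is self-contained, and `DMLRunner`'s job method is a hypothetical placeholder, not the repo's actual interface:

```python
from concurrent.futures import ThreadPoolExecutor

class DMLRunner:
    # Hypothetical stand-in for the repo's DMLRunner: runs one job
    # synchronously and returns a result.
    def run_job(self, job):
        return job * 2  # placeholder work

class RunnerPool:
    # Mimics the DMLRunnerActor idea: wrap unmodified DMLRunner instances
    # and run their jobs concurrently, instead of annotating DMLRunner
    # itself as an actor (which would break its existing tests).
    def __init__(self, num_runners=4):  # default of 4, as in the PR
        self.runners = [DMLRunner() for _ in range(num_runners)]
        self.pool = ThreadPoolExecutor(max_workers=num_runners)

    def run_all(self, jobs):
        # Dispatch jobs to runners round-robin, then block for all results.
        futures = [
            self.pool.submit(self.runners[i % len(self.runners)].run_job, job)
            for i, job in enumerate(jobs)
        ]
        return [f.result() for f in futures]

pool = RunnerPool()
print(pool.run_all([1, 2, 3]))  # -> [2, 4, 6]
```

With Ray, `RunnerPool.run_all` would instead call `actor.run_job.remote(job)` on each `DMLRunnerActor` handle and collect results with `ray.get`, which also lets the runners live in separate processes.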
georgymh commented 6 years ago

@neeleshdodda44: Please see the comments above.

kiddyboots216 commented 6 years ago

@georgymh fyi i'm not approving PRs from now on so if this is good you should merge it

georgymh commented 6 years ago

Small fix and merged. Good job @neeleshdodda44