MichiganCOG / ViP

Video Platform for Action Recognition and Object Detection in Pytorch
MIT License

Allow Unseeded Training #23

Closed lemmersj closed 5 years ago

lemmersj commented 5 years ago

Sometimes it is necessary to produce networks seeded randomly (for showing robust performance, or for ensembling). It would be nice to be able to do this without changing the config at each launch, especially if there is a delay between sending the start command and actually launching the program.

zeonzir commented 5 years ago

A randomly seeded experiment should be entirely possible. However, the fixed seed is necessary for reproducing conditions and running repeatable experiments. Rather than removing the seed entirely or adding an option to bypass it, I would suggest generating a random seed and passing it as input to the pbs script. Let me know what you think.

lemmersj commented 5 years ago

I agree with the rationale behind having a fixed seed as a capability, but I feel that a random seed might be an expected capability, or even the expected default behavior: someone who doesn't read the yaml file too closely might not realize the random number generator is seeded by default.

I also think the implementation would be very straightforward: when the json is read, check whether the seed is set to a designated "random" value. If so, set the seed to the current timestamp.
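The check described above could be sketched roughly as follows. This is a hypothetical illustration, not ViP's actual config schema: the key name `"seed"` and the sentinel value `"random"` are assumptions.

```python
import json
import time

def resolve_seed(config_path):
    """Load a json config and replace a designated "random" seed
    value with one derived from the current timestamp.

    Hypothetical sketch: the "seed" key and "random" sentinel are
    assumed, not taken from ViP's real config format.
    """
    with open(config_path) as f:
        config = json.load(f)
    if config.get("seed") == "random":
        # Timestamp-based seed, so each launch differs even if the
        # config file itself never changes between runs.
        config["seed"] = int(time.time())
    return config
```

Logging the resolved seed at startup would keep the run reproducible after the fact, since the timestamp-derived value can be fed back in as a fixed seed.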

ehofesmann commented 5 years ago

We are going to leave it up to the user to manually change the seed, which can be automated using the --seed argument. If you still want to remove the seed, you can comment out the line at the bottom of train.py. There is currently a bug, though, that doesn't seed numpy, so all of your experiments up to this point have been random anyway (#25).
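The numpy bug mentioned above comes down to the fact that each library keeps its own RNG state, so seeding PyTorch alone leaves numpy (and Python's `random`) unseeded. A general sketch of seeding all three, not the exact code in ViP's train.py, looks like:

```python
import random

import numpy as np
import torch

def seed_everything(seed):
    """Seed the RNGs typically involved in a PyTorch training run.

    torch.manual_seed covers CPU tensors; numpy and Python's random
    module must each be seeded separately, which is exactly what the
    bug referenced above (#25) missed for numpy.
    """
    random.seed(seed)
    np.random.seed(seed)
    torch.manual_seed(seed)
    # Also seed all CUDA devices; a no-op when no GPU is present.
    torch.cuda.manual_seed_all(seed)
```

Note that even with all RNGs seeded, some CUDA kernels are nondeterministic by default, so identical seeds do not guarantee bit-identical results across runs.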