chadcwilliams closed this 2 months ago
After diving deep into the training procedure, `torch.nn.Dropout` does not seem to take the manually set seed into account. Maybe this changes in later versions. Implemented the seeding now anyway for the GAN, AE, and generate_samples.
I think that we should have seed as a parameter so that users can set their own rather than always defaulting to 42.
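A minimal sketch of what that could look like (the `train` function name and signature here are illustrative, not the project's actual API): accept a `seed` parameter that defaults to 42 so current behavior is preserved, but let users override it.

```python
import random

import numpy as np


def train(data, seed=42):
    """Hypothetical training entry point: `seed` is user-settable,
    defaulting to 42 to keep the current default behavior."""
    random.seed(seed)
    np.random.seed(seed)
    # ... training loop would go here; we return a few draws
    # just to demonstrate that runs with the same seed match.
    return [random.random() for _ in range(3)]


# Same seed -> identical draws; a different seed changes them.
assert train([], seed=7) == train([], seed=7)
assert train([], seed=7) != train([], seed=8)
```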
There looks to be a history of `torch.nn.Dropout` seed problems:
https://stackoverflow.com/questions/75537766/make-nn-dropout-exactly-reproducible-for-same-seed-between-cpu-and-cuda-device
Scouring the net, it looks like it should be fixed by now: https://discuss.pytorch.org/t/does-random-seed-in-pytorch-has-the-effect-on-the-function-of-drop-out/24741/2
I did encounter this function in an answer that could be useful: https://stackoverflow.com/a/67239038
import random

import numpy
import torch


def setup_seed(seed):
    # Seed every RNG that training code may touch
    random.seed(seed)
    numpy.random.seed(seed)
    torch.manual_seed(seed)
    torch.cuda.manual_seed(seed)       # no-op without a GPU
    torch.cuda.manual_seed_all(seed)   # covers multi-GPU setups
    torch.backends.cudnn.deterministic = True


setup_seed(42)
I implemented the above and everything, including `torch.nn.Dropout`, is now reproducible.
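A quick sanity check of that claim (a minimal sketch, not project code): with the same seed, two dropout passes should produce identical masks. The helper below mirrors the `setup_seed` function from the linked answer so the snippet is self-contained.

```python
import random

import numpy as np
import torch


def setup_seed(seed):
    """Seed every RNG that the dropout path may touch."""
    random.seed(seed)
    np.random.seed(seed)
    torch.manual_seed(seed)
    torch.cuda.manual_seed_all(seed)  # no-op without a GPU
    torch.backends.cudnn.deterministic = True


dropout = torch.nn.Dropout(p=0.5)  # module defaults to training mode
x = torch.ones(1000)

setup_seed(42)
first = dropout(x)

setup_seed(42)
second = dropout(x)

# Identical seeds should yield identical dropout masks.
assert torch.equal(first, second)
```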
Noticed that generate_samples_main also needs a seed fix.
Completed with #102
We should add a seed parameter for reproducibility.