AutoResearch / EEG-GAN

Set seed #67

Closed by chadcwilliams 2 months ago

chadcwilliams commented 8 months ago

We should add a seed parameter for reproducibility.

whyhardt commented 3 months ago

After diving deep into the training procedure, torch.nn.Dropout does not seem to take the manually set seed into account. Maybe this changes with later versions. I have implemented it anyway for the GAN, AE, and generate_samples.
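For context, here is a minimal sketch (not from the repository) of the kind of check this involves: whether a manually set seed controls the mask drawn by torch.nn.Dropout. The dropout_once helper is hypothetical.

import torch

def dropout_once(seed):
    # Re-seed, then run a single dropout pass in training mode
    torch.manual_seed(seed)
    layer = torch.nn.Dropout(p=0.5)
    layer.train()                       # dropout is only active in training mode
    return layer(torch.ones(8))

# The same seed should give the same dropout mask if the seed is respected
print(torch.equal(dropout_once(42), dropout_once(42)))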

chadcwilliams commented 3 months ago

I think that we should have seed as a parameter so that users can set their own rather than always defaulting to 42.
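For illustration only, one way to expose such a parameter; the --seed flag below is a hypothetical sketch, not the project's actual CLI.

import argparse
import torch

parser = argparse.ArgumentParser()
parser.add_argument('--seed', type=int, default=42,
                    help='random seed for reproducible runs')
args = parser.parse_args()

torch.manual_seed(args.seed)   # plus the other seeding calls discussed below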

There seems to be a history of seed problems with torch.nn.Dropout: https://stackoverflow.com/questions/75537766/make-nn-dropout-exactly-reproducible-for-same-seed-between-cpu-and-cuda-device

From scouring the net, it looks like this should have been fixed by now: https://discuss.pytorch.org/t/does-random-seed-in-pytorch-has-the-effect-on-the-function-of-drop-out/24741/2

I did encounter this function in an answer that could be useful: https://stackoverflow.com/a/67239038

import random
import numpy
import torch

def setup_seed(seed):
    random.seed(seed)                          # Python's built-in RNG
    numpy.random.seed(seed)                    # NumPy RNG
    torch.manual_seed(seed)                    # PyTorch CPU RNG
    torch.cuda.manual_seed(seed)               # current CUDA device RNG
    torch.cuda.manual_seed_all(seed)           # all CUDA devices
    torch.backends.cudnn.deterministic = True  # force deterministic cuDNN kernels

setup_seed(42)

chadcwilliams commented 3 months ago

I implemented the above and everything, including torch.nn.Dropout, is now reproducible.
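As a hypothetical spot-check of that claim (assuming setup_seed from the snippet above is in scope), re-seeding before each forward pass should yield identical outputs from a dropout-containing model:

import torch

model = torch.nn.Sequential(torch.nn.Linear(4, 4), torch.nn.Dropout(p=0.5))
x = torch.ones(1, 4)

setup_seed(42)
out1 = model(x)    # dropout mask drawn from the re-seeded RNG (modules default to training mode)
setup_seed(42)
out2 = model(x)    # same seed, so the same mask

print(torch.equal(out1, out2))   # expected: True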

chadcwilliams commented 3 months ago

Noticed that generate_samples_main also needs a seed fix.
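A hypothetical sketch of where that fix could go; the real signature of generate_samples_main in the repository may differ, and setup_seed is the helper from earlier in this thread.

def generate_samples_main(seed=42, **kwargs):
    # Seed every RNG before any samples are drawn
    setup_seed(seed)
    # ... existing sample-generation logic would follow here ...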

chadcwilliams commented 2 months ago

Completed with #102.