lg-zhang / dynamic-soft-margin-pytorch


Running the script multiple times, the results are different. #3

Open lovekittynine opened 4 years ago

lovekittynine commented 4 years ago

Hello, thank you! I have a question about running your script. Specifically: running the script multiple times, the results are different each time.

Looking forward to your reply! Thank you!

lg-zhang commented 4 years ago

Do you mean training or inference? Inference results should not change unless you update PyTorch or cudnn. Training results are not guaranteed to be consistent (because it's SGD), but they should be close enough.

lovekittynine commented 4 years ago

Thanks for your reply! In the training stage, even though I set a random seed, the results are different each time. Other works, such as HardNet, produce the same training results when an identical random seed is set.

lg-zhang commented 4 years ago

Could you specify which random seeds you have set? To make training deterministic, you normally need to fix the seeds for pytorch, random, and np.random, and also set cudnn to deterministic mode.
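For reference, a typical setup covering all of those sources of randomness might look like the sketch below (the `set_seed` helper is illustrative, not part of this repository; the cudnn flags are the standard PyTorch knobs for deterministic behavior):

```python
import random

import numpy as np
import torch


def set_seed(seed: int = 0) -> None:
    """Fix the common sources of randomness in a PyTorch training run."""
    random.seed(seed)                 # Python's built-in RNG
    np.random.seed(seed)              # NumPy RNG (e.g. data augmentation)
    torch.manual_seed(seed)           # seeds the CPU RNG and all CUDA devices
    torch.cuda.manual_seed_all(seed)  # explicit, in case of multiple GPUs
    # Make cudnn pick deterministic kernels instead of autotuned ones.
    torch.backends.cudnn.deterministic = True
    torch.backends.cudnn.benchmark = False


set_seed(0)
```

Note that even with all of these set, some CUDA ops remain nondeterministic, so exact bit-for-bit reproducibility across runs is not always achievable.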

lovekittynine commented 4 years ago

np.random.seed(0)
torch.cuda.manual_seed(0)
torch.backends.cudnn.deterministic = True

lg-zhang commented 4 years ago

Could you try: random.seed(0)?