Hello, thanks for the great work! I'm confused about the random-seed setting. I noticed that in trainer_helper.py the random seed is reset before each training epoch, as follows:
for epoch in range(start_epoch, self.cfg['max_epoch']):
    # reset random seed
    # ref: https://github.com/pytorch/pytorch/issues/5059
    np.random.seed(np.random.get_state()[1][0] + epoch)
    # train one epoch
    self.train_one_epoch(epoch, tb_writer)
Meanwhile, in train_val.py the random seed seems to have already been fixed by "set_random_seed(cfg.get('random_seed', 444))".
So why reset the random seed each epoch after fixing it?
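For reference, here is a minimal sketch of what I understand that reseed line to compute (assuming NumPy's legacy MT19937 seeding, where seeding with a small integer stores it as the first word of the state key; the training loop is simplified from the snippet above):

```python
import numpy as np

# Seeding with a small integer stores it as the first word of the
# Mersenne Twister state key, so get_state()[1][0] recovers it
# immediately after seeding.
np.random.seed(444)
base = int(np.random.get_state()[1][0])
print(base)  # 444

# Per-epoch reseed as in trainer_helper.py: the new seed is derived
# from the current state plus the epoch index, so it is deterministic
# given the initial seed but different for each epoch.
for epoch in range(3):
    np.random.seed(np.random.get_state()[1][0] + epoch)
    print(np.random.rand())
```

So if I read it correctly, the per-epoch reseed does not undo the fixed seed from train_val.py; it just derives a different (but still reproducible) seed for each epoch. Is that the intent?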
Looking forward to a reply, and thanks!