Closed. XiaoqiangZhou closed this issue 4 years ago.
Besides, there is no softmax or tanh operation after the final convolution. Is it correct?
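One common reason a final convolution carries no softmax is that PyTorch's nn.CrossEntropyLoss applies log-softmax internally, so the network is trained on raw logits. Whether that is what this repo intended is not confirmed here; the snippet below is only a hedged sketch of that pattern, with all layer sizes invented for illustration.

```python
import torch
import torch.nn as nn

# Hypothetical 1x1 conv head producing raw class logits; no softmax inside the model.
logits_head = nn.Conv2d(16, 3, kernel_size=1)
criterion = nn.CrossEntropyLoss()  # applies log-softmax internally during training

features = torch.randn(2, 16, 8, 8)              # fake feature maps
target = torch.randint(0, 3, (2, 8, 8))         # fake per-pixel class labels
loss = criterion(logits_head(features), target)  # logits go straight into the loss

# At inference time, probabilities are obtained by applying softmax explicitly.
probs = torch.softmax(logits_head(features), dim=1)
```

If the network instead ends with a loss that expects probabilities (e.g. nn.NLLLoss without a preceding log-softmax), omitting the activation would indeed be a bug.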
Hi, thank you for helping us find the problem.
@steven413d got it! Thanks~
@steven413d Could you please explain what role self.G.train() plays in trainer.py, line 82? I can't find the definition of a train function in the unet class. Is this function inherited from nn.Module? What will happen if we remove self.G.train()?
Thanks for your patience~
https://stackoverflow.com/questions/51433378/what-does-model-train-do-in-pytorch But model.train() is the default mode, so I think removing it is OK.
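To make the inherited behavior concrete: every nn.Module provides train() and eval(), which toggle the module's training flag and thereby the behavior of layers like Dropout and BatchNorm. A minimal demonstration (the model here is invented, not the repo's unet):

```python
import torch
import torch.nn as nn

# Dropout behaves differently in train vs. eval mode, so it makes a good probe.
model = nn.Sequential(nn.Linear(4, 4), nn.Dropout(p=0.5))

x = torch.ones(1, 4)

model.train()                # training mode: dropout randomly zeroes activations
assert model.training        # train()/eval() just flip this inherited flag

model.eval()                 # eval mode: dropout is a no-op, forward is deterministic
y1 = model(x)
y2 = model(x)
print(torch.equal(y1, y2))   # deterministic in eval mode
```

So calling self.G.train() is harmless redundancy if the model was never switched to eval mode, but it matters whenever evaluation code calls eval() and training resumes afterwards.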
@steven413d Thanks for your patience~
Hi, it seems you didn't use ReLU and BN on the decoder side. Did you implement it this way on purpose?
In the definition of unetUp, self.conv = unetConv2(in_size, out_size, False), where False means is_batchnorm=False.
Thanks.
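For readers following along, the repo's actual unetConv2 may differ, but a typical implementation of such an is_batchnorm flag looks like the hypothetical sketch below: the flag only controls whether BatchNorm2d is inserted between the convolution and the ReLU.

```python
import torch
import torch.nn as nn

class UnetConv2(nn.Module):
    """Two 3x3 conv blocks; BatchNorm is inserted only when is_batchnorm=True.

    Hypothetical reconstruction for illustration, not the repo's exact code.
    """
    def __init__(self, in_size, out_size, is_batchnorm):
        super().__init__()

        def block(i, o):
            layers = [nn.Conv2d(i, o, kernel_size=3, padding=1)]
            if is_batchnorm:
                layers.append(nn.BatchNorm2d(o))
            layers.append(nn.ReLU(inplace=True))
            return nn.Sequential(*layers)

        self.conv1 = block(in_size, out_size)
        self.conv2 = block(out_size, out_size)

    def forward(self, x):
        return self.conv2(self.conv1(x))

# Passing False, as unetUp does, skips BatchNorm but keeps the ReLU.
decoder_conv = UnetConv2(4, 8, False)
out = decoder_conv(torch.randn(1, 4, 8, 8))
```

Note that in this common layout the ReLU is still present when is_batchnorm=False; only the normalization is dropped, which matches the answer above about the BN question but leaves the ReLU question open.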