Open louxy126 opened 5 years ago
Hi!
Regarding dropout: the original paper does not mention dropout in the architecture section or in the figure you referenced. However, the authors state in Section 3.1 (Data Augmentation):
"Drop-out layers at the end of the contracting path perform further implicit data augmentation."
If you take a look at the authors' original resource, you can see that dropout is used twice, namely at the last two levels of the contracting path (the left side of the figure) -- which is exactly where zhixuhao applied it as well.
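To make the placement concrete, here is a minimal sketch of the deepest two contracting blocks in the style of zhixuhao's Keras code. The input size and filter counts are illustrative assumptions, not taken from the repo verbatim; the point is only where `Dropout(0.5)` sits.

```python
from tensorflow.keras.layers import Input, Conv2D, MaxPooling2D, Dropout

# Illustrative input; the repo's actual input shape may differ.
inputs = Input((256, 256, 1))

# Contracting level 4 (sketch): two 3x3 convs, then dropout, then pooling.
x = Conv2D(512, 3, activation='relu', padding='same',
           kernel_initializer='he_normal')(inputs)
conv4 = Conv2D(512, 3, activation='relu', padding='same',
               kernel_initializer='he_normal')(x)
drop4 = Dropout(0.5)(conv4)          # dropout at the end of the contracting path
pool4 = MaxPooling2D(pool_size=(2, 2))(drop4)

# Level 5 (the bottom of the U): two 3x3 convs, then the second dropout.
conv5 = Conv2D(1024, 3, activation='relu', padding='same',
               kernel_initializer='he_normal')(pool4)
conv5 = Conv2D(1024, 3, activation='relu', padding='same',
               kernel_initializer='he_normal')(conv5)
drop5 = Dropout(0.5)(conv5)          # fed into the expansive path via upsampling
```

This matches the quote above: the dropout layers act as implicit data augmentation at the end of the contracting path, and they are absent from the figure because the figure only shows conv/pool/up-conv arrows.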
Hi @zhixuhao, thanks for your code! I found some dropout layers in your code, such as `drop4 = Dropout(0.5)(conv4)`, but I don't see them in the picture of the net. Did you add them yourself? Also, the last two convolutions in your code are
```python
conv9 = Conv2D(2, 3, activation = 'relu', padding = 'same', kernel_initializer = 'he_normal')(conv9)
conv10 = Conv2D(1, 1, activation = 'sigmoid')(conv9)
```
but I think it should be

```python
conv9 = Conv2D(2, 1, activation = 'relu', padding = 'same', kernel_initializer = 'he_normal')(conv9)
```

because the last "→" in the picture is a 1×1 convolution. Am I wrong?
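For comparison, here is a shape-only sketch of the two readings of the final layers. Variable names follow zhixuhao's code; the 256×256×64 feature map stands in for the last decoder output, and the softmax head is my reading of the paper's "conv 1×1" arrow (the paper uses a pixel-wise softmax over 2 classes), not code from either source.

```python
from tensorflow.keras.layers import Input, Conv2D

# Stand-in for the last 64-channel feature map of the expansive path.
conv9 = Input((256, 256, 64))

# zhixuhao's code: an extra 3x3 conv down to 2 channels,
# then a 1x1 conv with sigmoid producing a single-channel mask.
out_repo = Conv2D(2, 3, activation='relu', padding='same',
                  kernel_initializer='he_normal')(conv9)
out_repo = Conv2D(1, 1, activation='sigmoid')(out_repo)

# The paper's figure, as I read it: one 1x1 conv mapping the
# 64 channels directly to the 2 output classes.
out_paper = Conv2D(2, 1, activation='softmax')(conv9)
```

Both heads are valid for binary segmentation; the repo's sigmoid variant just emits one probability map instead of two class maps, which pairs naturally with a binary cross-entropy loss.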