juliandewit / kaggle_ndsb2

3rd place solution for the second Kaggle National Data Science Bowl

Several questions regarding your solution #4

Open · wenouyang opened this issue 8 years ago

wenouyang commented 8 years ago

Hi Julian,

Thanks for sharing the code. I have several questions after reading your solution document.

According to the U-net paper, the output map has size (row, column, 2), i.e., it has two feature channels. But it looks like you only use one channel. Is that right? Could you explain this in more detail?

You once mentioned that "segmentation nets are numerically unstable"; could you elaborate on this point? Are there any references discussing it?

You mentioned: "Note that I used relu activations and batch normalization after every convolution". Regarding "batch normalization" here, do you mean that you add a normalization layer after each convolution layer? If I remember correctly, I once heard that a separate normalization layer may not be needed if batch normalization is used in the optimization method. Specifically, I am not clear what you mean by "use batch normalization after every convolution".

How many epochs did you use for training?

Thanks for the help.

juliandewit commented 8 years ago

Hello,

  1. I just do (logistic) regression for every pixel, so I only need one channel. Perhaps they use softmax for every pixel? Softmax needs two values per pixel (see the first sketch below).
  2. At first I had a pretty hard time getting learning to happen: many "all zero" results and/or "NaN"s. In the paper they say they need careful initialization, so I guess they had the same problems. Note, for instance, that the number of '1' pixels is much lower than the number of '0' pixels. Batch normalization plus a low learning rate helped to overcome most of the problems. Without BN it's pretty hard to get the network to be "stable" and "learn", although in the end my net worked without BN.
  3. I used Batch Normalization layers, placed directly after the convolutional layers (see the second sketch below).
  4. 50 epochs should be more than enough.
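
For point 1, here is a minimal sketch contrasting the two output-head choices. It uses the modern Keras API rather than this repo's original code, so the exact layer names are assumptions:

```python
from tensorflow.keras import layers

def sigmoid_head(features):
    # One channel per pixel: the probability that the pixel is foreground,
    # trained with binary cross-entropy ("logistic regression per pixel").
    return layers.Conv2D(1, (1, 1), activation="sigmoid")(features)

def softmax_head(features):
    # Two channels per pixel: [P(background), P(foreground)], trained with
    # categorical cross-entropy, as in the original U-net paper.
    return layers.Conv2D(2, (1, 1), activation="softmax")(features)
```

For a binary mask both heads carry the same information; the 1-channel sigmoid version is simply the more economical formulation.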
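For points 2 and 3, a sketch of the conv → batch-norm → relu block combined with a low learning rate. Again this is illustrative (modern Keras API; the filter counts, image size, and learning rate are placeholder values, not taken from this repo):

```python
from tensorflow.keras import layers, models, optimizers

def conv_bn_relu(x, filters):
    # Convolution without a built-in activation...
    x = layers.Conv2D(filters, (3, 3), padding="same")(x)
    # ...batch normalization directly after the convolution...
    x = layers.BatchNormalization()(x)
    # ...and only then the relu non-linearity.
    return layers.Activation("relu")(x)

inputs = layers.Input(shape=(256, 256, 1))  # image size is a placeholder
x = conv_bn_relu(inputs, 32)
x = conv_bn_relu(x, 32)
# One sigmoid channel per pixel, as in answer 1.
outputs = layers.Conv2D(1, (1, 1), activation="sigmoid")(x)

model = models.Model(inputs, outputs)
# A deliberately low learning rate helps against the "all zero" / NaN
# behaviour mentioned above; 1e-4 is an illustrative value.
model.compile(optimizer=optimizers.Adam(learning_rate=1e-4),
              loss="binary_crossentropy")
```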

Good luck.