jakeret / tf_unet

Generic U-Net Tensorflow implementation for image segmentation
GNU General Public License v3.0

Strange behavior, training improvement #231

Closed: rytisss closed this issue 5 years ago

rytisss commented 5 years ago

Hi, thank you for the U-Net implementation. I tried it and it trained quite well. I am writing regarding my results. Here are a few of them:

[prediction images for epoch_1 through epoch_10]

I have ~1200 images with defects of a single class that I want to recognize. For the training (the per-epoch predictions are shown above) I used a U-Net with 4 layers and features_root=16. I kept my minibatch size at 2 and set training_iters=600 so that one epoch covers the whole dataset (600 iterations × batch size 2 = 1200 images).
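For reference, a minimal sketch of that configuration using the tf_unet API. The glob path and output directory are placeholders, and `channels=1` / `n_class=2` are assumptions (grayscale images, one defect class vs. background):

```python
from tf_unet import unet, image_util

# Hypothetical paths -- replace with your own data and output locations.
data_provider = image_util.ImageDataProvider("defects/train/*.tif")

# 4 layers, features_root=16, as described above.
net = unet.Unet(channels=1, n_class=2, layers=4, features_root=16)

# Batch size 2; 600 iterations x batch size 2 = ~1200 images per epoch.
trainer = unet.Trainer(net, batch_size=2)
path = trainer.train(data_provider, "./unet_trained",
                     training_iters=600, epochs=10)
```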

  1. Are there any tricks besides increasing features_root or the number of layers to improve precision/accuracy?
  2. Is the behavior of the network normal when, as shown in epoch_6.jpg through epoch_8.jpg, the prediction seems to degrade and then normalize again?
  3. Should one epoch cover the whole dataset, or can it cover less?

P.S. I applied the fix mentioned in #228 to the code I used.

jakeret commented 5 years ago

Hi, glad that the implementation is helping you. As for your questions:

  1. There are other hyperparameters you can tune, such as the learning rate, training algorithm, loss function, image pre-processing, etc. There is no definitive answer as to which works best; using your domain know-how and exploring the options is necessary (see the sketch after this list).
  2. Yes, this can happen as the model adapts its weights over the course of training. As long as it only happens sporadically and normalizes afterwards, it should be fine.
  3. In principle it can cover less, but the more data the network sees, the better.
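As a concrete illustration of point 1, here is a minimal sketch of how these knobs can be set through the tf_unet API; the specific values are illustrative placeholders, not recommendations:

```python
from tf_unet import unet

# Swap the loss function: "cross_entropy" (default) or "dice_coefficient".
net = unet.Unet(channels=1, n_class=2, layers=4, features_root=16,
                cost="dice_coefficient")

# Swap the training algorithm and its learning-rate settings.
# With optimizer="momentum", tf_unet applies exponential learning-rate decay.
trainer = unet.Trainer(net, batch_size=2, optimizer="momentum",
                       opt_kwargs=dict(learning_rate=0.2,
                                       decay_rate=0.95,
                                       momentum=0.2))

# Alternatively, Adam with a fixed learning rate:
# trainer = unet.Trainer(net, batch_size=2, optimizer="adam",
#                        opt_kwargs=dict(learning_rate=0.001))
```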
rytisss commented 5 years ago

Thank you for answers :)

I have noticed that when I increased features_root to 32, the network trains more evenly; in every epoch you can see on the training-batch predictions that the features become progressively stronger.

In general, it all works nicely!