alexgkendall / SegNet-Tutorial

Files for a tutorial to train SegNet for road scenes using the CamVid dataset
http://mi.eng.cam.ac.uk/projects/segnet/tutorial.html

Testing Net at Training time, regarding Batch_Normalization #112

Open tenacious-b opened 6 years ago

tenacious-b commented 6 years ago

Good day,

I understand that after training the network, compute_bn_statistics.py must be run to recompute the batch-normalization statistics and produce the weights used with inference.prototxt; only then can I test the network. I would like to plot the test loss in the same way the training loss is plotted during training, so I added include { phase: TEST } to a second data layer in my train.prototxt, whose txt file points to the test images, and let Caffe run this test phase every n iterations.

However, I would think that no proper batch normalization is applied during this TEST phase, since compute_bn_statistics.py must be run separately at the end, right? Which means the TEST accuracy and loss reported during training are not correct, right?
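For reference, this is a minimal sketch of the setup I mean, using the DenseImageData layer type from this tutorial's prototxts; the source path, batch size, and solver intervals are placeholders:

```
# Second data layer in train.prototxt, active only in the TEST phase
# (source path and batch_size are placeholders).
layer {
  name: "data"
  type: "DenseImageData"
  top: "data"
  top: "label"
  dense_image_data_param {
    source: "/SegNet/CamVid/test.txt"   # txt file listing test image/label pairs
    batch_size: 1
    shuffle: false
  }
  include { phase: TEST }
}

# Corresponding entries in the solver prototxt (standard Caffe solver fields):
# test_iter: 100       # number of TEST batches to run per evaluation
# test_interval: 1000  # run the TEST phase every n training iterations
```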

  1. Is there any solution or idea to tackle this problem?
  2. What is the advantage of computing the BN statistics for the test phase offline (via compute_bn_statistics.py)?
  3. I understand Caffe's built-in BN layer does the same thing, but online. Does anyone have an example of how to write the train.prototxt using Caffe's BN layer instead of SegNet's? (See the sketch after this list.)
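Regarding question 3, here is a minimal sketch of how upstream (BVLC) Caffe expresses batch normalization with its BatchNorm + Scale layer pair, in place of the BN layer from the SegNet fork. Layer and blob names such as conv1_1 are placeholders, and this assumes a Caffe build that actually provides these layers (the caffe-segnet fork may not):

```
# Upstream Caffe batch normalization: BatchNorm normalizes using either
# mini-batch or running statistics; Scale learns the per-channel
# gamma/beta (bias_term: true).
layer {
  name: "conv1_1_bn"
  type: "BatchNorm"
  bottom: "conv1_1"
  top: "conv1_1"
  param { lr_mult: 0 }  # running mean
  param { lr_mult: 0 }  # running variance
  param { lr_mult: 0 }  # moving-average factor
  batch_norm_param { use_global_stats: false }  # TRAIN: use mini-batch statistics
  include { phase: TRAIN }
}
layer {
  name: "conv1_1_bn"          # same name, so the TEST net shares the blobs
  type: "BatchNorm"
  bottom: "conv1_1"
  top: "conv1_1"
  param { lr_mult: 0 }
  param { lr_mult: 0 }
  param { lr_mult: 0 }
  batch_norm_param { use_global_stats: true }   # TEST: use accumulated running statistics
  include { phase: TEST }
}
layer {
  name: "conv1_1_scale"
  type: "Scale"
  bottom: "conv1_1"
  top: "conv1_1"
  scale_param { bias_term: true }
}
```

With this arrangement the running mean and variance are accumulated during training and used automatically in the TEST phase, so no separate offline statistics pass would be needed. Please correct me if this is not equivalent to what compute_bn_statistics.py does.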

Thank you very much for your time, CG