Open · Paulito-7 opened this issue 6 years ago
The only explanation I can think of is that your validation set is small and not representative of overall accuracy, or that you're testing on only a few images and those are too few to be a good representation of the model's overall accuracy.
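One quick way to check this is to compute mAP over the entire validation set rather than eyeballing a handful of images. A minimal sketch, assuming the `mrcnn.utils.compute_ap` and `modellib.load_image_gt` helpers from this repo and a model built in inference mode:

```python
import numpy as np
from mrcnn import utils
import mrcnn.model as modellib

def evaluate_map(model, dataset, config, iou_threshold=0.5):
    """Mean Average Precision over every image in a prepared Dataset.

    `model` must be a MaskRCNN built with mode="inference"; `dataset`
    is e.g. your val_dataset after dataset.prepare().
    """
    aps = []
    for image_id in dataset.image_ids:
        # Load ground truth, resized/molded the same way as in training
        image, _, gt_class_id, gt_bbox, gt_mask = modellib.load_image_gt(
            dataset, config, image_id, use_mini_mask=False)
        # Run detection on the single image
        r = model.detect([image], verbose=0)[0]
        # Per-image AP at the given IoU threshold
        ap, _, _, _ = utils.compute_ap(
            gt_bbox, gt_class_id, gt_mask,
            r["rois"], r["class_ids"], r["scores"], r["masks"],
            iou_threshold=iou_threshold)
        aps.append(ap)
    return np.mean(aps)
```

If the mAP computed this way swings noticeably when you add or remove a few validation images, the set is probably too small to trust.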
Thank you for your answer. I will try to increase the size of my dataset and will keep you updated!
what's the update?
@Paulito-7
update sir
@waleedka First, I would like to say thank you for this great implementation of Mask R-CNN! I am currently working on my own dataset, but I noticed a strange behavior during training. I divided my training into stages as follows, applying a learning-rate decay between stages (calling `model.train` from `model.py`):

```python
# Stage 1: train only the randomly initialized head layers
model.train(train_dataset, val_dataset,
            learning_rate=config.LEARNING_RATE, epochs=50, layers='heads')
# Stage 2: fine-tune ResNet stage 4 and up at a 10x lower learning rate
model.train(train_dataset, val_dataset,
            learning_rate=config.LEARNING_RATE/10, epochs=70, layers='4+')
# Stage 3: fine-tune ResNet stage 3 and up
model.train(train_dataset, val_dataset,
            learning_rate=config.LEARNING_RATE/10, epochs=90, layers='3+')
```
My `val_loss` goes up when training the lower stages, but the results, when testing on some images, actually improve. Does this come from the way false positives and false negatives are taken into account in the loss computation, or is it something else? Have you ever run into this? Thank you!
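(For context: `val_loss` in this repo is a weighted sum of the RPN, class, box, and mask losses, not a detection metric, so it can drift upward while mAP still improves. One way to see which way the model is really moving is to re-evaluate between stages. A sketch, assuming the `evaluate_map` helper from earlier in this thread; `InferenceConfig` and `MODEL_DIR` are illustrative names, not from the thread:)

```python
import mrcnn.model as modellib

# Rebuild the network in inference mode, one image per batch.
# InferenceConfig and MODEL_DIR are illustrative; adapt to your setup.
class InferenceConfig(config.__class__):
    GPU_COUNT = 1
    IMAGES_PER_GPU = 1

inference_model = modellib.MaskRCNN(mode="inference",
                                    config=InferenceConfig(),
                                    model_dir=MODEL_DIR)
# find_last() returns the newest checkpoint path in recent versions of
# the repo; older versions return a (dir, path) tuple instead.
inference_model.load_weights(inference_model.find_last(), by_name=True)
print("val mAP:", evaluate_map(inference_model, val_dataset, config))
```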