gngdb opened this issue 9 years ago
It now seems that a good approach to avoiding overfitting and improving model performance is to train with as much augmentation as possible (but not so much that we badly distort the images). Then we can avoid using as much dropout, which probably increases training time and reduces model flexibility (although it's also supposed to improve the robustness of the features learned).
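For illustration, here's a rough sketch of the kind of random augmentation I mean; the transform set and the ranges (rotation angle, flips) are assumptions for the sketch, not settings from our runs:

```python
import numpy as np
from scipy.ndimage import rotate

def augment(image, rng, max_angle=20.0):
    """Randomly rotate and flip a single HxW image.

    The transforms and ranges here are placeholders for illustration,
    not the configuration from any of our runs.
    """
    out = rotate(image, rng.uniform(-max_angle, max_angle),
                 reshape=False, mode="nearest")
    if rng.random() < 0.5:
        out = np.fliplr(out)
    if rng.random() < 0.5:
        out = np.flipud(out)
    return out
```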
Online augmentation now performs as well as dense augmentation. We've run some models with larger numbers of augmentations, but with mixed results. Post links to results here if you have any.
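To be clear about terms: by "online" I mean sampling fresh augmentations for every batch during training, rather than pre-computing a dense augmented copy of the dataset up front. A rough sketch of the batching side, reusing the `augment` helper above (the batch logic is my assumption, not the repo's actual loader):

```python
import numpy as np

def online_batches(images, labels, batch_size, rng):
    """Yield (batch, labels) forever, drawing new random augmentations
    each pass instead of using a fixed pre-computed ('dense') set."""
    n = len(images)
    while True:
        order = rng.permutation(n)
        for start in range(0, n, batch_size):
            idx = order[start:start + batch_size]
            # augment() is the helper sketched in the comment above
            batch = np.stack([augment(images[i], rng) for i in idx])
            yield batch, labels[idx]
```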
It seems like we're going to have to be more careful about which augmentations we include; some may be making the task more difficult. We should also consider some changes to the model to speed up learning.
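If we want to be systematic about which augmentations to keep, a leave-one-out ablation over the transform list would show whether any single transform is hurting; the transform names below are placeholders, substitute whatever the pipeline actually uses:

```python
# Hypothetical transform names, not the pipeline's real ones.
TRANSFORMS = ["rotate", "flip_lr", "flip_ud", "shear", "translate"]

def ablation_configs(transforms=TRANSFORMS):
    """Yield the full config plus each leave-one-out config; one
    training run per config shows which transform (if any) hurts."""
    yield list(transforms)
    for t in transforms:
        yield [x for x in transforms if x != t]
```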
Increasing the augmentation should help deal with overfitting without using dropout, so with these models we can probably ease off on the dropout and increase the size of the MLP and convolutional layers.
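As a concrete sketch of that trade (hypothetical, in tf.keras notation; the layer sizes, class count, and dropout rate are all placeholders, not our actual architecture):

```python
import tensorflow as tf
from tensorflow.keras import layers

def build_model(input_shape=(64, 64, 1), n_classes=10, dropout=0.1):
    """Wider conv + MLP stack with a reduced dropout rate, leaning on
    augmentation for regularisation. All sizes are placeholders."""
    return tf.keras.Sequential([
        tf.keras.Input(shape=input_shape),
        layers.Conv2D(64, 3, padding="same", activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(128, 3, padding="same", activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(512, activation="relu"),
        layers.Dropout(dropout),  # eased off, vs. e.g. 0.5 previously
        layers.Dense(n_classes, activation="softmax"),
    ])
```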