ghost opened this issue 3 years ago
For small datasets, consider pre-training on a similar public dataset and then fine-tuning on your data. You may also try reducing the learning rate, which may help.
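A minimal sketch of what that suggestion can look like in PyTorch, using torchvision's DeepLabV3 as a stand-in for EncNet (the weights, learning rates, and head replacement below are illustrative assumptions, not this repo's training script):

```python
import torch
from torchvision.models.segmentation import deeplabv3_resnet50
from torchvision.models.segmentation.deeplabv3 import DeepLabHead

NUM_CLASSES = 15  # the 15 classes from the original question

# Start from pre-trained weights instead of random initialisation.
model = deeplabv3_resnet50(pretrained=True)

# Replace the classification head so it predicts the custom classes.
model.classifier = DeepLabHead(2048, NUM_CLASSES)

# Fine-tune with a reduced learning rate for the pre-trained backbone
# and a larger one for the freshly initialised head.
optimizer = torch.optim.SGD(
    [
        {"params": model.backbone.parameters(), "lr": 1e-4},
        {"params": model.classifier.parameters(), "lr": 1e-3},
    ],
    momentum=0.9,
    weight_decay=1e-4,
)
```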
Hi, thanks for your response.
Sadly, there aren't any public datasets available for the use case I am working on.
I was thinking of the following:
Any feedback is much appreciated.
Hi,
Apologies if this has been asked before.
I have a custom dataset with ~700 tagged images; the total number of classes is 15.
I have trained a model using the following combination (this gives the best results so far):
```
python train_dist.py --dataset ADE20K --model EncNet --aux --se-loss --backbone resnest101 --epochs XXX
```
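For context, a hedged sketch of how a dataset like this (~700 image/mask pairs, 15 classes) is typically wrapped as a PyTorch `Dataset`; the directory layout and file naming are assumptions, not details from this issue:

```python
import os
from PIL import Image
from torch.utils.data import Dataset

class CustomSegDataset(Dataset):
    """Loads (image, mask) pairs; each mask pixel holds a class index 0..14."""

    def __init__(self, root, transform=None):
        self.img_dir = os.path.join(root, "images")  # assumed layout
        self.mask_dir = os.path.join(root, "masks")  # assumed layout
        self.names = sorted(os.listdir(self.img_dir))
        self.transform = transform

    def __len__(self):
        return len(self.names)

    def __getitem__(self, idx):
        name = self.names[idx]
        image = Image.open(os.path.join(self.img_dir, name)).convert("RGB")
        # Assumes each mask shares its image's filename (a PNG of class indices).
        mask = Image.open(os.path.join(self.mask_dir, name))
        if self.transform is not None:
            image, mask = self.transform(image, mask)
        return image, mask
```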
It is giving okay-ish results. I understand the dataset is too small to expect very good results, but the classes are rather simple and I cannot afford more data. I want to fine-tune (or even deliberately overfit) to my custom data for demonstration purposes. But whatever I have tried so far, it just doesn't get any better.
Can you suggest any best practices for fine-tuning on such a small dataset?
Thank you!