leondgarse / keras_efficientnet_v2

Self-defined EfficientNetV2 following the official version, including converted ImageNet/21K/21k-ft1k weights.
Apache License 2.0

Best configuration for finetuning #4

Closed · Jogima-cyber closed 3 years ago

Jogima-cyber commented 3 years ago

Do you have an idea of the best configuration for fine-tuning? Which optimizer, which LR reduction strategy, whether dropout or regularization is needed, or any unusual training techniques?

Jogima-cyber commented 3 years ago

Btw, for my current configuration I'm using this LR reducer with the Adam optimizer, LR set to 0.001, and regular tf training:

import tensorflow as tf

lr_reducer = tf.keras.callbacks.ReduceLROnPlateau(
        monitor="val_classification_loss", patience=3, min_lr=1e-6, mode='min')
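For reference, the pieces described above (Adam at 1e-3, the plateau-based LR reducer, a dropout head) could be wired together roughly like this. This is only a sketch: the backbone here is a stand-in built from `tf.keras.applications` rather than this repo's models, and names like `train_ds`/`val_ds` are placeholders, not from the thread.

```python
import tensorflow as tf

# Stand-in backbone; in practice you would load one of this repo's
# EfficientNetV2 models with pretrained weights instead.
base = tf.keras.applications.EfficientNetV2B0(
    include_top=False, weights=None, pooling="avg")

# New classification head with dropout for fine-tuning (10 classes is arbitrary).
model = tf.keras.Sequential([
    base,
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(10, activation="softmax"),
])

# Adam with the LR from the thread (0.001).
model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=1e-3),
    loss="sparse_categorical_crossentropy",
    metrics=["accuracy"],
)

# Same plateau-based reducer as above; "val_loss" is the default loss name
# when the model has a single output.
lr_reducer = tf.keras.callbacks.ReduceLROnPlateau(
    monitor="val_loss", patience=3, min_lr=1e-6, mode="min")

# model.fit(train_ds, validation_data=val_ds, epochs=30, callbacks=[lr_reducer])
```

With `ReduceLROnPlateau`, the LR is cut (by `factor`, default 0.1) whenever the monitored metric fails to improve for `patience` epochs, never going below `min_lr`.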
leondgarse commented 3 years ago
Jogima-cyber commented 3 years ago

Thank you very much for your insights!