titu1994 / DenseNet

DenseNet implementation in Keras
MIT License
707 stars 294 forks

I cannot get good performance #19

Closed FrancisYizhang closed 7 years ago

FrancisYizhang commented 7 years ago

Dear @titu1994

I ran your code, but I cannot reach the accuracy reported in the paper. Why?

The parameters are listed below:

```python
batch_size = 64
nb_classes = 100
nb_epoch = 300

img_rows, img_cols = 32, 32
img_channels = 3

img_dim = (img_channels, img_rows, img_cols) if K.image_dim_ordering() == "th" else (img_rows, img_cols, img_channels)
depth = 40
nb_dense_block = 3
growth_rate = 12
nb_filter = 12
bottleneck = True
reduction = 0.2
dropout_rate = 0.2  # 0.0 for data augmentation
optimizer = Adam(lr=1e-2)  # Using Adam instead of SGD to speed up training
```
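As a sanity check on the `depth` setting: in the paper, a bottleneck (DenseNet-BC) network splits `depth - 4` layers evenly across the dense blocks, with each bottleneck unit counting as two conv layers. A rough check using the paper's formula (not necessarily this repo's exact code):

```python
depth = 40
nb_dense_block = 3
bottleneck = True

# (depth - 4) excludes the initial conv and the transition/classification
# layers; each bottleneck unit contributes two layers (1x1 then 3x3 conv).
layers_per_block = (depth - 4) // (nb_dense_block * (2 if bottleneck else 1))
print(layers_per_block)  # 6 bottleneck units per dense block
```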

The result is:

```
loss: 2.3397 - acc: 0.4425 - val_loss: 2.5654 - val_acc: 0.4034
Accuracy: 40.34
Error: 59.66
```

Can you help me, thanks!

FrancisYizhang commented 7 years ago

Or, could you share your pre-trained model with me? Thanks a lot.

titu1994 commented 7 years ago

How long have you run it? It takes nearly 300 epochs to train to completion.
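For reference, the paper itself trains with SGD for 300 epochs and divides the learning rate by 10 at 50% and again at 75% of the budget. A minimal sketch of that schedule (the function name and values follow the paper, not necessarily this repo's training script):

```python
def paper_lr_schedule(epoch, base_lr=0.1, nb_epoch=300):
    """DenseNet paper schedule: divide the initial learning rate by 10
    at 50% of training and by 100 at 75% of training."""
    if epoch >= int(0.75 * nb_epoch):
        return base_lr / 100.0
    if epoch >= int(0.5 * nb_epoch):
        return base_lr / 10.0
    return base_lr

# Plug into Keras with, e.g.:
#   from keras.callbacks import LearningRateScheduler
#   callbacks = [LearningRateScheduler(paper_lr_schedule)]
print(paper_lr_schedule(0), paper_lr_schedule(150), paper_lr_schedule(225))
```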

Also, weights for CIFAR-10 are provided in the repo. Download the repo again to get the weights.