geoffwoollard / ece1512_project

test out different optimizers #12

Closed geoffwoollard closed 5 years ago

geoffwoollard commented 5 years ago

We are using SGD, but there are others: `from keras.optimizers import SGD, RMSprop, Adadelta, Adam`.

They may need special arguments.
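
For example, a minimal sketch of swapping the optimizer at compile time; `build_model`, `x_train`, `y_train`, `x_val`, `y_val`, and the learning rates below are placeholders for the project's own model, data, and settings, not values tested in this thread:

```python
from keras.optimizers import SGD, RMSprop, Adadelta, Adam

# Candidate optimizers; the learning rates here are just the Keras defaults.
candidates = {
    'sgd': SGD(lr=0.01),
    'rmsprop': RMSprop(lr=0.001),
    'adadelta': Adadelta(lr=1.0),
    'adam': Adam(lr=0.001),
}

for name, opt in candidates.items():
    model = build_model()  # placeholder for the project's model-building function
    model.compile(optimizer=opt,
                  loss='categorical_crossentropy',
                  metrics=['categorical_accuracy'])
    model.fit(x_train, y_train,
              batch_size=num,  # batch size (placeholder)
              epochs=3,
              validation_data=(x_val, y_val))
```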

Davjes15 commented 5 years ago

I tried different optimizers:

SGD

RMSprop

Adagrad

Adadelta

Adam

Adamax

I tried these techniques using the original consensus network, with `num = 1000` (batch size) and `n_crop = 380` (image size).

Layers

`k1_size = 15, k2_size = 15, pool1 = 7, k3_size = 7, k4_size = 7, pool2 = 5, k5_size = 3, k6_size = 3, pool3 = 3, k7_size = 3, k8_size = 3, av_pool = 4, val_dense = 512`
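
For reference, a rough sketch of how these hyperparameters could map onto a Keras model. The filter counts, `same` padding, ReLU activations, single-channel input, and two output classes are assumptions, since the issue only lists kernel and pool sizes:

```python
from keras.models import Sequential
from keras.layers import Conv2D, MaxPooling2D, AveragePooling2D, Flatten, Dense

n_crop = 380      # image size
val_dense = 512   # dense layer width

model = Sequential()
# Block 1: two 15x15 convolutions (k1_size, k2_size), then 7x7 pooling (pool1)
model.add(Conv2D(16, (15, 15), activation='relu', padding='same',
                 input_shape=(n_crop, n_crop, 1)))  # filter counts are placeholders
model.add(Conv2D(16, (15, 15), activation='relu', padding='same'))
model.add(MaxPooling2D(pool_size=(7, 7), padding='same'))
# Block 2: two 7x7 convolutions (k3_size, k4_size), then 5x5 pooling (pool2)
model.add(Conv2D(32, (7, 7), activation='relu', padding='same'))
model.add(Conv2D(32, (7, 7), activation='relu', padding='same'))
model.add(MaxPooling2D(pool_size=(5, 5), padding='same'))
# Block 3: two 3x3 convolutions (k5_size, k6_size), then 3x3 pooling (pool3)
model.add(Conv2D(64, (3, 3), activation='relu', padding='same'))
model.add(Conv2D(64, (3, 3), activation='relu', padding='same'))
model.add(MaxPooling2D(pool_size=(3, 3), padding='same'))
# Block 4: two 3x3 convolutions (k7_size, k8_size), 4x4 average pooling (av_pool)
model.add(Conv2D(64, (3, 3), activation='relu', padding='same'))
model.add(Conv2D(64, (3, 3), activation='relu', padding='same'))
model.add(AveragePooling2D(pool_size=(4, 4), padding='same'))
model.add(Flatten())
model.add(Dense(val_dense, activation='relu'))
model.add(Dense(2, activation='softmax'))  # number of classes is a placeholder
```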

Adadelta and SGD give the same result, while Adam is the fastest optimizer. All of them reach a categorical accuracy of 1 after three epochs, and each epoch takes around 60 seconds.

Davjes15 commented 5 years ago

@David, try different parameters for SGD and Adam.
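
For example (a hedged sketch; these particular values are common starting points, not results from this project), the parameterized forms of the two optimizers look like:

```python
from keras.optimizers import SGD, Adam

# SGD with momentum and Nesterov acceleration.
sgd = SGD(lr=0.01, momentum=0.9, decay=1e-6, nesterov=True)

# Adam with a smaller-than-default learning rate.
adam = Adam(lr=0.0005, beta_1=0.9, beta_2=0.999, epsilon=None, decay=0.0)

model.compile(optimizer=sgd,  # or optimizer=adam
              loss='categorical_crossentropy',
              metrics=['categorical_accuracy'])
```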

Davjes15 commented 5 years ago

I tried different parameters for the Adam optimizer in Keras.

`adamax = optimizers.Adamax(lr=0.002, beta_1=0.9, beta_2=0.999, epsilon=None, decay=0.0)`

`lr` is the learning rate (float >= 0).
`beta_1`: float, 0 < beta < 1. Generally close to 1.
`beta_2`: float, 0 < beta < 1. Generally close to 1.
`epsilon`: float >= 0. Fuzz factor. If None, defaults to `K.epsilon()`.
`decay`: float >= 0. Learning rate decay over each update.

I mainly changed the lr, since for Adam it is recommended to keep the other values constant. The results are similar: I could not notice changes in time or accuracy, since the model reaches 100% accuracy after the third epoch.
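
For reference, a small sketch of the kind of lr sweep described above; the candidate learning rates and the `build_model` / training-data names are placeholders:

```python
from keras import optimizers

# Sweep the learning rate while keeping beta_1/beta_2 at the recommended defaults.
for lr in [0.0005, 0.001, 0.002, 0.005]:  # candidate values (placeholders)
    opt = optimizers.Adamax(lr=lr, beta_1=0.9, beta_2=0.999,
                            epsilon=None, decay=0.0)
    model = build_model()  # placeholder for the project's model-building function
    model.compile(optimizer=opt,
                  loss='categorical_crossentropy',
                  metrics=['categorical_accuracy'])
    history = model.fit(x_train, y_train, batch_size=num, epochs=3,
                        validation_data=(x_val, y_val))
    print(lr, history.history['categorical_accuracy'][-1])
```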