maikherbig / AIDeveloper

GUI-based software for training, evaluating and applying deep neural nets for image classification
BSD 2-Clause "Simplified" License

Configuring optimizer arguments #7

Closed alfrem closed 4 years ago

alfrem commented 4 years ago

The arguments differ between optimizers, and it would be nice to be able to configure them, as described in:

https://keras.io/optimizers/

The learning rate is already in place; it would be perfect to also have rho, beta_1, beta_2, amsgrad, momentum, and nesterov.
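For reference, the per-optimizer arguments and their defaults (taken from the Keras documentation linked above, Keras 2.x era) could be summarized in a table like the sketch below. The `optimizer_kwargs` helper is purely illustrative, not part of AIDeveloper; a GUI popup could expose exactly the keys listed for the chosen optimizer.

```python
# Hypothetical sketch: per-optimizer hyperparameters with the defaults
# documented at https://keras.io/optimizers/ (Keras 2.x era).
OPTIMIZER_ARGS = {
    "SGD":      {"lr": 0.01,  "momentum": 0.0, "nesterov": False},
    "RMSprop":  {"lr": 0.001, "rho": 0.9},
    "Adagrad":  {"lr": 0.01},
    "Adadelta": {"lr": 1.0,   "rho": 0.95},
    "Adam":     {"lr": 0.001, "beta_1": 0.9, "beta_2": 0.999, "amsgrad": False},
    "Nadam":    {"lr": 0.002, "beta_1": 0.9, "beta_2": 0.999},
}

def optimizer_kwargs(name, **overrides):
    """Merge user overrides into the documented defaults for one optimizer,
    rejecting arguments the chosen optimizer does not accept."""
    args = dict(OPTIMIZER_ARGS[name])
    unknown = set(overrides) - set(args)
    if unknown:
        raise ValueError(f"{name} does not accept: {sorted(unknown)}")
    args.update(overrides)
    return args
```

For example, `optimizer_kwargs("Adam", amsgrad=True)` would return the Adam defaults with amsgrad switched on, while asking SGD for `rho` would raise an error.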

Great job, by the way.

maikherbig commented 4 years ago

Hi, I thought about that and actually included these options in an initial design of the UI (see the screenshot below). But then I decided to keep it more "simple", since the default parameters are typically just fine. Do you like the design below? Do you change these other parameters often?

[screenshot: MoreOptimizer_Parameters]

alfrem commented 4 years ago

Seems perfect and very functional to me :). You are great!

maikherbig commented 4 years ago

Solved via c7689b1. I decided to make the menu less apparent by adding just a little button (...) next to the Optimizer drop-down menu. The button invokes a popup menu similar to the one shown in my comment above. Sorry it took me so long; I was focused on implementing learning rate schedules (cyclical learning rates etc.). In my experience, the learning rate is one of the more important hyperparameters.
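For readers unfamiliar with cyclical learning rates, the triangular policy (Smith, 2017) mentioned above can be sketched in a few lines. The function and parameter names here are illustrative, not AIDeveloper's actual implementation:

```python
# Sketch of a triangular cyclical learning rate (Smith, 2017): the rate
# oscillates linearly between base_lr and max_lr, one full cycle every
# 2 * step_size training iterations. Names and defaults are illustrative.
def triangular_clr(iteration, base_lr=1e-4, max_lr=1e-3, step_size=2000):
    cycle = iteration // (2 * step_size)
    # x goes 1 -> 0 -> 1 over one cycle (0 at the cycle's midpoint)
    x = abs(iteration / step_size - 2 * cycle - 1)
    return base_lr + (max_lr - base_lr) * max(0.0, 1.0 - x)
```

At iteration 0 the rate is `base_lr`, it peaks at `max_lr` after `step_size` iterations, and returns to `base_lr` at the end of the cycle; such a function could be plugged into a Keras `LearningRateScheduler` callback.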