Open · ljakupi opened this issue 5 years ago
Hello, I have the same issue as you. I added momentum as an optimizer hyperparameter, but momentum is not included in the hyperparameter results. Additionally, I don't really understand how the other hyperparameters such as Dropout_1, Dropout_2, ... got there; I didn't request them. Thank you for any advice.
Tbh I switched to another approach that supports Bayesian optimization, since I had difficulties adapting hyperas to my project. In hyperas, if you give a uniform range to a parameter, the result is the exact value, while if you use choice, the result is the index of the value in the list you passed to that parameter (hence choice/categorical). But one should probably read more about hyperopt, since hyperas is just a wrapper around it. As for your case, I have no idea why it is giving you dropout results when you are not tuning dropout!
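To make the index-vs-value distinction concrete, here is a minimal sketch (not hyperas API; the dict keys, option lists, and helper name are illustrative) of how one might map a best_run dict back to concrete values, given the same option lists used in the model template:

```python
# Assumption (from the discussion above): hyperas reports choice parameters
# as the *index* into the option list, and uniform parameters as the literal
# sampled value. The spaces below are made-up examples.
CHOICE_SPACES = {
    'Dense': [32, 64, 128],       # e.g. {{choice([32, 64, 128])}}
    'l2': [0.01, 0.05, 0.1],      # e.g. {{choice([0.01, 0.05, 0.1])}}
}

def decode_best_run(best_run):
    """Replace choice indices with the values they point at."""
    decoded = {}
    for name, value in best_run.items():
        if name in CHOICE_SPACES:
            decoded[name] = CHOICE_SPACES[name][value]  # index -> value
        else:
            decoded[name] = value                       # uniform: already a value
    return decoded

best_run = {'Dense': 2, 'l2': 1, 'Dropout': 0.065}
print(decode_best_run(best_run))  # {'Dense': 128, 'l2': 0.05, 'Dropout': 0.065}
```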
Hello, thanks for your feedback. Which approach did you use? Thanks
On 2 June 2019 at 11:21, Labinot Jakupi notifications@github.com wrote:
I found a way out. By declaring

num_layers = {{choice(['two', 'three'])}}

and then branching on it:

if num_layers == 'three':

the results will include 'num_layers' in the best_run dictionary.
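For reference, the workaround can be sketched as a full hyperas model template (a sketch only, not plain Python: the {{...}} markers are expanded by hyperas before the function runs, and the layer sizes, epochs, and other specifics here are illustrative):

```python
def create_model(x_train, y_train, x_test, y_test):
    # Binding the choice to a variable makes best_run report a
    # 'num_layers' key whose value is the chosen label itself.
    num_layers = {{choice(['two', 'three'])}}

    model = tf.keras.Sequential()
    model.add(tf.keras.layers.Dense({{choice([32, 64, 128])}}, activation='relu'))
    model.add(tf.keras.layers.Dense({{choice([32, 64, 128])}}, activation='relu'))

    if num_layers == 'three':
        # the third hidden layer is added only when the sampled label says so
        model.add(tf.keras.layers.Dense({{choice([32, 64, 128])}}, activation='relu'))
        model.add(tf.keras.layers.Dropout({{uniform(0, 1)}}))

    model.add(tf.keras.layers.Dense(y_train.shape[1], activation='softmax'))
    model.compile(loss='categorical_crossentropy', optimizer='adam',
                  metrics=['accuracy'])
    model.fit(x_train, y_train, epochs=10, verbose=0)
    score = model.evaluate(x_test, y_test, verbose=0)
    return {'loss': -score[1], 'status': STATUS_OK, 'model': model}
```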
@ljakupi can you maybe share your code showing how you've done the optimization of layers and neurons with hyperopt?
@happyangry if I understand you correctly, you want to start with a 1-layer network as a base and then, depending on the choice of 'num_layers', add the remaining layers via some kind of case statement?
@cklat I want to find out which is better: a 2-layer or a 3-layer network. In @ljakupi's code he uses
if {{choice(['two', 'three'])}} == 'three':
    model.add(tf.keras.layers.Dense(
        {{choice([32, 64, 128])}},
        kernel_regularizer=tf.keras.regularizers.l2({{choice([0.01, 0.05, 0.1])}})
    ))
    model.add(tf.keras.layers.Activation('relu'))
    # dropout
    model.add(tf.keras.layers.Dropout({{uniform(0, 1)}}))
but the results didn't directly indicate whether the 3rd layer should be added:
'Dense': 2,
'Dense_1': 0,
'Dropout': 0.06536980304050743,
'Dropout_1': 1,
'Dropout_2': 0.11245974166556161,
'chooseOptimiser': 2,
'epochs': 1,
'kernel_initializer': 1,
'l2': 1,
'l2_1': 2,
'lr': 0,
'lr_1': 1,
'lr_2': 2
Although we could check the model structure with model.summary().
After declaring

num_layers = {{choice(['two', 'three'])}}

the results will contain this:

'num_layers': 'three',
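With the label reported directly, deciding the final architecture is a plain lookup. A minimal sketch (the best_run values below are illustrative, not real tuning output):

```python
# Once num_layers is declared as a named choice, best_run carries the
# chosen label itself rather than an index, so no decoding is needed.
best_run = {'num_layers': 'three', 'Dropout': 0.065}  # illustrative values

hidden_layers = {'two': 2, 'three': 3}[best_run['num_layers']]
print(hidden_layers)  # 3
```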
The implementation above gives me those params. How can I know whether a third Dense layer should be added (or not) to get the best architecture?