Closed Jonty800 closed 5 years ago
I don't see any reason why you can't do this. The Talos API is just a workflow layer added on top of the Keras API, i.e. generally speaking Talos does not support specific Keras features one by one; it supports the Keras API as a whole.
Closing here. Feel free to open a new issue if anything else comes up.
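To make that concrete, here is a minimal sketch of how an activation choice can live in the params dict and flow into an ordinary Keras model function. The toy data, layer sizes and the exact Scan keyword names are assumptions here and may differ between Talos versions.

import numpy as np
import talos
from keras.models import Sequential
from keras.layers import Dense

# toy data, only to keep the sketch self-contained
x = np.random.rand(200, 8)
y = np.random.randint(0, 2, size=(200, 1))

# the search space is a plain dict of lists
p = {'first_neuron': [16, 32],
     'activation': ['relu', 'elu']}

def build_model(x_train, y_train, x_val, y_val, params):
    # a regular Keras model; Talos only injects one params combination per run
    model = Sequential()
    model.add(Dense(params['first_neuron'], input_dim=8,
                    activation=params['activation']))
    model.add(Dense(1, activation='sigmoid'))
    model.compile(optimizer='adam', loss='binary_crossentropy',
                  metrics=['accuracy'])
    out = model.fit(x_train, y_train, validation_data=(x_val, y_val),
                    epochs=5, verbose=0)
    return out, model

# keyword names here vary slightly between Talos versions
scan = talos.Scan(x=x, y=y, params=p, model=build_model,
                  experiment_name='activation_demo')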
Thanks for your help.
I was able to achieve this using a list of strings like so:
import keras
from keras.layers import Conv1D, Activation

x = Conv1D(filters, 3, padding='same', name='enc1')(inputTensor1)
if params['activation'] == 'leakyrelu':
    x = keras.layers.LeakyReLU(alpha=0.1)(x)
elif params['activation'] == 'prelu':
    x = keras.layers.PReLU(alpha_initializer='zeros', alpha_regularizer=None,
                           alpha_constraint=None, shared_axes=None)(x)
else:
    # plain string activations ('relu', 'elu', ...) fall back to Activation
    x = Activation(params['activation'])(x)
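The matching entry in the Talos params dict is then just a list of strings; the concrete values below are only an illustration.

params = {
    'filters': [32, 64],
    'activation': ['leakyrelu', 'prelu', 'relu', 'elu'],
}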
This only works for layers that are already built into Keras. If you want to use custom layers from e.g. keras_contrib, then you're out of luck as soon as you want to deploy or fetch the best model. Unfortunately, I discovered this issue after running five days' worth of hyperparameter optimization.
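For anyone hitting the same wall: reloading a saved model that contains layers core Keras does not know about generally requires passing those classes to load_model via custom_objects. A minimal sketch, assuming keras_contrib's InstanceNormalization as the custom layer and a hypothetical file name:

from keras.models import load_model
from keras_contrib.layers import InstanceNormalization

# without custom_objects, deserialization fails with an "Unknown layer" error
model = load_model('best_model.h5',
                   custom_objects={'InstanceNormalization': InstanceNormalization})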
then you're out of luck as soon as you want to deploy or fetch the best model. Unfortunately, I discovered this issue after running five days' worth of hyperparameter optimization.
@bjtho08 Can you explain what you mean by this? This should be really straightforward to resolve. Better open a new ticket though.
I will do it within the next hour or so. I'm on my phone right now, but as soon as I get to my office, I can sit down and make a ticket.
From your guides, I can see that we can set activations through the functional API like so:
x = Conv1D(params['filters'], 3, padding='same', activation=params['activation'])(x)
Can we also use hyperparameter optimisation with advanced activation layers such as LeakyReLU or PReLU? Can I specify that I want this activation to be either PReLU, LeakyReLU, ReLU, or ELU?
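For context, LeakyReLU and PReLU are layers rather than activation strings, so they cannot be passed to the activation argument directly; a minimal sketch of the layer form, with the input shape as an assumption:

from keras.layers import Input, Conv1D, LeakyReLU

inputs = Input(shape=(128, 1))              # hypothetical input shape
x = Conv1D(64, 3, padding='same')(inputs)   # no activation argument here
x = LeakyReLU(alpha=0.1)(x)                 # applied as a separate layer instead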
Thanks for any help