Open chrisflip opened 1 year ago
Hi @chrisflip ,
Thanks for your question and sorry for the late reply.
I see two options:

- Define separate hyperparameters `n_neurons1`, `n_neurons2`, etc., one per hidden layer.
- Define a single hyperparameter `n_neurons` that contains a list of the number of neurons per layer (e.g., `[100, 50, 10]`), and use a custom function in the `param_distribs` dictionary to sample from this multi-dimensional space.

That said, I don't think it's necessary. People used to do this, but it complicates things, and in practice it didn't really help: using the same number of neurons in every layer usually works fine. There's essentially one exception: you may want a bottleneck layer in the middle, like in autoencoders, but this only requires one additional parameter.
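For the second option, scikit-learn's `RandomizedSearchCV` accepts any object with an `rvs()` method as a parameter distribution, so the custom sampler could look something like the sketch below. The class name, layer-count range, and size choices are my own illustration, not from the book:

```python
import random

class LayerSizesSampler:
    """Custom distribution for param_distribs: each call to rvs()
    returns a list of hidden-layer sizes, e.g. [100, 50, 10].
    RandomizedSearchCV treats any object with an rvs() method
    as a distribution to sample from."""

    def __init__(self, max_layers=3, choices=(10, 50, 100)):
        self.max_layers = max_layers
        self.choices = choices

    def rvs(self, random_state=None):
        # scikit-learn may pass a RandomState here; for this sketch we
        # only seed the local RNG when given a plain int.
        rng = random.Random(random_state if isinstance(random_state, int) else None)
        n_layers = rng.randint(1, self.max_layers)  # 1..max_layers hidden layers
        return [rng.choice(self.choices) for _ in range(n_layers)]

# Mirrors the param_distribs pattern from the chapter; the exact key
# depends on how your model-building function names its argument.
param_distribs = {
    "n_neurons": LayerSizesSampler(max_layers=3, choices=(10, 50, 100)),
}

sample = param_distribs["n_neurons"].rvs(random_state=42)
print(sample)  # a list of 1-3 layer sizes drawn from the choices
```

Each trial then receives a whole list of layer sizes, and the model-building function can loop over it to add one `Dense` layer per entry.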
Hope this helps.
Hi, first of all, thanks for this amazing book. I have a question regarding Chapter 10, hyperparameter tuning with Keras and sklearn: the model allows for multiple hidden layers. However, I believe that `n_neurons` is fixed across all hidden layers. How can I make the model more flexible, so that `n_neurons` can change with every layer?

Best, Chris