jimmyroyer opened this issue 8 years ago
I presume you tried `seed` as a constructor argument and it's not deterministic?
`seed` works, but it seeds the whole function. For instance, if I use a randomized search over the meta-parameters, it seeds the searched parameters as well as the dropout and the weights. I was wondering whether it is possible to seed only the random dropout part of the classifier.
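A minimal sketch of what a dropout-only seed would mean, using plain NumPy rather than this library's internals (the function names here are illustrative, not the library's API): each source of randomness gets its own independent generator, so re-seeding the dropout stream reproduces the same masks without disturbing weight initialization.

```python
import numpy as np

# Two independent RNG streams, one per source of randomness.
rng_weights = np.random.default_rng(0)   # seeds weight initialization only
rng_dropout = np.random.default_rng(42)  # seeds dropout masks only

def init_weights(shape, rng):
    # Small random weights drawn from the weight stream.
    return rng.normal(0.0, 0.1, size=shape)

def dropout_mask(shape, rate, rng):
    # Inverted-dropout mask drawn from the dropout stream:
    # kept units are scaled by 1 / (1 - rate).
    return (rng.random(shape) >= rate).astype(float) / (1.0 - rate)

W = init_weights((4, 3), rng_weights)
mask = dropout_mask((4, 3), 0.5, rng_dropout)

# Re-seeding only the dropout stream reproduces the exact same mask,
# while the weight stream is left untouched.
mask2 = dropout_mask((4, 3), 0.5, np.random.default_rng(42))
assert np.array_equal(mask, mask2)
```

With a single global seed, by contrast, every extra draw (e.g. by the parameter search) shifts the dropout stream, which is the behavior described above.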
No, it's not currently possible. Could you share why you need a dropout-only seed?
I want to treat the seed of the weights and of the randomized search as another "hyper" parameter.
I'm not sure I would support a separate seed for dropout; it shouldn't make any difference as the number of iterations increases. More important is to expose the weight-initialization strategy!
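For comparison, scikit-learn already exposes this split for the two other randomness sources mentioned in the thread (its `MLPClassifier` has no dropout, so this only illustrates the separate-seed pattern, not a dropout seed): the search's `random_state` controls parameter sampling independently of the estimator, and the estimator's `random_state` (which drives weight initialization) can itself be searched over as a hyperparameter.

```python
from scipy.stats import loguniform
from sklearn.datasets import make_classification
from sklearn.model_selection import RandomizedSearchCV
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=200, random_state=0)

search = RandomizedSearchCV(
    MLPClassifier(max_iter=50),
    param_distributions={
        "alpha": loguniform(1e-5, 1e-1),
        # Weight-initialization seed treated as a searchable hyperparameter.
        "random_state": range(10),
    },
    n_iter=4,
    cv=2,
    random_state=123,  # seeds the parameter sampling only
)
search.fit(X, y)
print(search.best_params_)
```

Exposing the requested dropout seed would amount to adding one more independent stream alongside these two.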
Hello, is it possible to seed (initialize) the random dropout regularization in the MLP classifier? Thanks