jimmyroyer opened 8 years ago
You can access the parameters at a low level using get_parameters() and set_parameters(). The values are Lasagne's defaults, though it doesn't matter too much if you use batch normalization. There's no way to customize them except through get/set.
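As a rough illustration of that get/set contract, here is a minimal mock in plain numpy: one (weights, biases) pair per layer, with set_parameters() expecting the same structure back. This is a hypothetical stand-in, not the real sknn classifier, whose return structure may differ in detail:

```python
import numpy as np

# Hypothetical mock of the classifier's parameter interface discussed
# above: get_parameters() returns one (weights, biases) pair per layer,
# and set_parameters() accepts the same structure.
class MockBackend:
    def __init__(self, layer_sizes):
        rng = np.random.RandomState(0)
        self.params = [
            (rng.randn(n_in, n_out), np.zeros(n_out))
            for n_in, n_out in zip(layer_sizes[:-1], layer_sizes[1:])
        ]

    def get_parameters(self):
        # Return copies so callers can inspect or modify them safely.
        return [(W.copy(), b.copy()) for W, b in self.params]

    def set_parameters(self, new_params):
        # Shapes must match the existing architecture.
        for (W, b), (W_new, b_new) in zip(self.params, new_params):
            assert W.shape == W_new.shape and b.shape == b_new.shape
        self.params = [(W.copy(), b.copy()) for W, b in new_params]

nn = MockBackend([4, 8, 3])  # 4 inputs, one hidden layer of 8, 3 outputs
params = nn.get_parameters()
params[0] = (params[0][0] * 0.5, params[0][1])  # rescale first-layer weights
nn.set_parameters(params)
print([W.shape for W, _ in nn.get_parameters()])  # [(4, 8), (8, 3)]
```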
Am I right that I can access those functions through the classifier's _backend? Thank you very much.
Also, when I try to use the option normalize="batch" I get the following error message. Any help would be very useful. Thanks!
Traceback (most recent call last):
File "
You need the latest Lasagne from requirements.txt.
Thank you very much. normalize="batch" indeed works with the latest Lasagne. I have a follow-up question, however, regarding get_parameters() and set_parameters(): I get the following error when I call myclassifier.get_parameters(). And when I look at the structure of _mlp_to_array(), I can't figure out where the weights and biases are. Thanks again for your help.
Traceback (most recent call last):
File "
Batch normalization seems to break the set/get because it requires many values, not just a bias and a weight per layer. It's a bug ;-)
Thanks for your super quick responses. Can we access the final weights/biases somewhere else?
Call nn._backend._mlp_to_array() directly; it should return a long list of vectors.
Thanks again. Great package btw.
Hi, is it possible to change the initial values of the weights and biases? I think they are set in Lasagne, but I don't know which initial values are used or whether I can change them through Classifier. Thank you very much.
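For reference: Lasagne's DenseLayer defaults, if memory serves, are GlorotUniform for the weights and zeros for the biases. A small numpy sketch of that scheme (and note that, per the discussion above, one workaround for custom initial values is to build the network and then push your own values in with set_parameters()):

```python
import numpy as np

# Numpy sketch of Glorot/Xavier uniform initialization, which is what
# Lasagne's DenseLayer uses for weights by default; biases default to 0.
# This is an illustration of the scheme, not Lasagne's own code.
def glorot_uniform(n_in, n_out, rng=None):
    rng = rng or np.random.RandomState(0)
    limit = np.sqrt(6.0 / (n_in + n_out))  # bound grows tighter with fan-in+fan-out
    return rng.uniform(-limit, limit, size=(n_in, n_out))

W = glorot_uniform(4, 8)
b = np.zeros(8)
print(W.shape, b.shape)  # (4, 8) (8,)
```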