Closed: SHuixo closed this issue 6 years ago.
Hi Sun-xiaohui,

First, I would binarize the weights:

```python
params = lasagne.layers.get_all_params(your_convnet)
for param in params:
    # print(param.name)
    if param.name == "W":
        param.set_value(binary_ops.SignNumpy(param.get_value()))
```
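For intuition, a sign binarization like `binary_ops.SignNumpy` typically maps every weight to ±1. Here is a minimal NumPy stand-in (`sign_numpy` is my name, not the repo's; I assume zero maps to +1):

```python
import numpy as np

def sign_numpy(w):
    """Hypothetical stand-in for binary_ops.SignNumpy:
    map every weight to +1 or -1 (zero is treated as +1)."""
    return np.where(w >= 0, 1.0, -1.0).astype(np.float32)

w = np.array([[0.3, -0.7], [0.0, -0.1]], dtype=np.float32)
assert sign_numpy(w).tolist() == [[1.0, -1.0], [1.0, -1.0]]
```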
Then, I would extract the convnet's parameters:

```python
params_values = lasagne.layers.get_all_param_values(your_convnet)
```
Please note that, to efficiently store those parameters, you would first need to concatenate the binary weights into integer variables.
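As a sketch of that packing step (the function names are mine, and the repo may pack in a different order), you can map ±1 weights to bits and store eight of them per byte with `np.packbits`:

```python
import numpy as np

def pack_binary_weights(w_bin):
    """Pack a flat array of +1/-1 weights into uint8 words,
    8 weights per byte. Returns the packed bytes and the
    original element count, needed to unpack later."""
    bits = (w_bin.ravel() > 0).astype(np.uint8)  # +1 -> 1, -1 -> 0
    return np.packbits(bits), w_bin.size

def unpack_binary_weights(packed, size):
    """Inverse of pack_binary_weights: recover the +1/-1 array."""
    bits = np.unpackbits(packed)[:size]
    return np.where(bits == 1, 1.0, -1.0).astype(np.float32)

w = np.array([1, -1, -1, 1, 1, 1, -1, 1, -1], dtype=np.float32)
packed, n = pack_binary_weights(w)
assert np.array_equal(unpack_binary_weights(packed, n), w)
```

Nine float32 weights (36 bytes) shrink to two bytes this way, which is the point of the packing.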
@MatthieuCourbariaux Thanks for your answer.
First of all, based on your code, I made the corresponding changes to extract the un-binarized weight parameters of each layer:

```python
layers = lasagne.layers.get_all_layers(your_convnet)
param_values = []
for layer in layers:
    # layer.get_params() returns only this layer's own parameters
    # (get_all_params(layer) would repeat every earlier layer's too)
    for param in layer.get_params():
        # print(param.name)
        if param.name == "W":
            param_values.append(param.get_value())
```
In this modified code, I removed the binary_ops.SignNumpy() call and directly save the data returned by get_value(). This should be the un-binarized W data; is that correct?
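One way to persist per-layer values like these is `np.savez` (my choice for this sketch, not something from the thread; the layer names and shapes below are dummies):

```python
import numpy as np
import os
import tempfile

# Dummy per-layer W arrays standing in for the extracted param_values.
layer_weights = {
    "layer0_W": np.ones((2, 2), dtype=np.float32),
    "layer1_W": -np.ones((3,), dtype=np.float32),
}

# Save all arrays into one .npz archive, keyed by name.
path = os.path.join(tempfile.mkdtemp(), "weights.npz")
np.savez(path, **layer_weights)

# Reload and check the round trip.
loaded = np.load(path)
assert set(loaded.files) == set(layer_weights)
assert np.array_equal(loaded["layer1_W"], layer_weights["layer1_W"])
```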
In addition, how can we save the alpha, beta and gamma parameters of the BatchNormLayer in the same way, in order?

Thanks, looking forward to your reply.
The get_all_params and get_all_param_values methods return all parameters, including alpha, beta and gamma.
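Since the values come back in traversal order, one way to separate them is to group them by parameter name. A minimal pure-Python sketch with dummy data (the names and values below are illustrative, not taken from the repo):

```python
from collections import defaultdict

# Dummy (name, value) pairs standing in for
# zip(get_all_params(net), get_all_param_values(net)).
named_values = [
    ("W", [[1.0, -1.0]]),
    ("beta", [0.0, 0.0]),
    ("gamma", [1.0, 1.0]),
    ("W", [[-1.0, 1.0]]),
]

# Group values by parameter name, preserving layer order within each group.
by_name = defaultdict(list)
for name, value in named_values:
    by_name[name].append(value)

assert sorted(by_name) == ["W", "beta", "gamma"]
assert len(by_name["W"]) == 2  # one W per layer, in network order
```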
@MatthieuCourbariaux Oh I see, thank you so much.
@MatthieuCourbariaux Hi Matthieu, could you explain how to extract the weight parameters W and the BatchNorm alpha, beta and gamma parameters from the code? Thank you.