MatthieuCourbariaux / BinaryNet

Training Deep Neural Networks with Weights and Activations Constrained to +1 or -1
BSD 3-Clause "New" or "Revised" License

How to save convolution weight parameters and batchnorm parameters during training #25

Closed: SHuixo closed this issue 6 years ago

SHuixo commented 6 years ago

@MatthieuCourbariaux Hi MatthieuCourbariaux, could you explain how to extract the weight parameters (W) and the batchnorm alpha, beta, and gamma parameters from the code? Thank you.

MatthieuCourbariaux commented 6 years ago

Hi Sun-xiaohui,

Firstly, I would binarize the weights:

import lasagne
import binary_ops  # binarization helpers from this repository (provides SignNumpy)

# Overwrite every weight matrix W with its binarized (+1/-1) version
params = lasagne.layers.get_all_params(your_convnet)
for param in params:
    # print param.name
    if param.name == "W":
        param.set_value(binary_ops.SignNumpy(param.get_value()))
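
If you only need the effect of that call, SignNumpy is simply an elementwise sign that maps every weight to +1 or -1; a minimal NumPy stand-in (illustrative, not necessarily the repository's exact implementation) looks like this:

import numpy as np

def sign_numpy(x):
    # x >= 0 maps to +1, x < 0 maps to -1, kept as float32 for Lasagne
    return np.float32(2. * np.greater_equal(x, 0) - 1.)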

Then, I would extract the convnet's parameters:

params_values = lasagne.layers.get_all_param_values(your_convnet)

Please note that, to efficiently store those parameters, you would first need to concatenate the binary weights into integer variables.
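
For reference, here is a minimal sketch of that packing step, assuming NumPy; np.packbits stores 8 binary weights per byte. The ndim test used to tell weight tensors from other parameters and the output filename are illustrative assumptions, not part of the repository:

import numpy as np
import lasagne

# All parameter values, in network order (weights, biases, batchnorm params, ...)
params_values = lasagne.layers.get_all_param_values(your_convnet)

packed = []
for values in params_values:
    if values.ndim >= 2:  # assumption: 2D+ tensors are the (binarized) weights
        # Map -1 -> 0 and +1 -> 1, then pack 8 binary weights into each byte.
        # Keep values.shape somewhere as well, so the weights can be unpacked later.
        bits = (values.flatten() > 0).astype(np.uint8)
        packed.append(np.packbits(bits))
    else:
        packed.append(values)  # biases / batchnorm parameters stay as floats

np.savez("convnet_params.npz", *packed)

Unpacking is the reverse: np.unpackbits, trim to the original number of weights, reshape to the stored shape, and map {0, 1} back to {-1, +1}.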

SHuixo commented 6 years ago

@MatthieuCourbariaux Thanks for your answer.

First of all, based on your code, I made the corresponding changes to extract the unbinarized weight parameters for each layer:

layers = lasagne.layers.get_all_layers(your_convnet)
for layer in layers:
    params = lasagne.layers.get_all_params(layer)
    for param in params:
        # print param.name
        if param.name == "W":
            param_values = param.get_value()

In this modification of the code, I removed the binary_ops.SignNumpy() call and directly saved the data obtained from the get_value() method. This should be the unbinarized W data, is that correct?

In addition, how can we save the alpha, beta, and gamma parameters of each BatchNormLayer in order, in the same way?

Thanks, looking forward to your reply.

MatthieuCourbariaux commented 6 years ago

The get_all_params and get_all_param_values methods return all parameters, including alpha, beta and gamma.
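
As an illustration, here is a hedged sketch that collects them layer by layer, assuming the network uses Lasagne's standard BatchNormLayer (whose stored parameters are named beta, gamma, mean, and inv_std; alpha there is the moving-average coefficient, a hyperparameter rather than a stored parameter) and with your_convnet as a placeholder:

import lasagne

# Collect batch-norm parameter values layer by layer, in network order
bn_values = []
for layer in lasagne.layers.get_all_layers(your_convnet):
    if isinstance(layer, lasagne.layers.BatchNormLayer):
        # get_params() returns only this layer's own parameters
        bn_values.append({p.name: p.get_value() for p in layer.get_params()})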

SHuixo commented 6 years ago

@MatthieuCourbariaux Oh I see, thank you so much.