sdemyanov / ConvNet

Convolutional Neural Networks for Matlab for classification and segmentation, including Invariant Backpropagation (IBP) and Adversarial Training (AT) algorithms. Training is performed on a GPU and requires cuDNN v5.

Exporting learned weights #22

Closed diegostefano closed 9 years ago

diegostefano commented 9 years ago

Hi! Congratulations on this great work, and thanks for sharing.

I'm trying to use the weights generated by the training procedure, but I could not figure out how they are organized inside that huge column vector. By inspecting the "getweights.m" file I saw that the weights come first, followed by the biases; but are the convolutional weights in row-major or column-major layout (I'm assuming the filters are laid out sequentially in the vector)? And in the feedforward layers, are the weights of a given neuron consecutive in the vector?

Thanks a lot!

sdemyanov commented 9 years ago

Hi, Diego,

Thanks for using it. In the Matlab variable layers{i}.k the weights of the convolutional layers are stored in column-major order. The 3rd dimension is the number of maps in the previous layer, and the 4th dimension is the number of maps in the current layer. BUT, when I compose the long vector, the fourth dimension becomes the first (via the command k_trans = permute(layers{l}.k, [4 1 2 3]);). This is how the memory has to be laid out for Alex's convolution functions.
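
For illustration, a minimal sketch of that flattening for one convolutional layer (the tensor sizes and the names fh, fw, k_vec are placeholders, not taken from the code):

```matlab
% Sketch of how one convolutional layer's kernels end up in the long vector.
% Assumes layers{l}.k has size [fh fw prev_maps cur_maps]; the names below
% are for illustration only.
k = layers{l}.k;                   % [fh x fw x prev_maps x cur_maps], column-major
k_trans = permute(k, [4 1 2 3]);   % current-layer maps become the leading dimension
k_vec = k_trans(:);                % column-major flattening, as it appears in the long vector
```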

The weights of the fully connected layers in Matlab are also in column-major order. The number of rows is the number of neurons in the current layer, and the number of columns is the number of neurons in the previous layer. In the long vector they are stored the same way.
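
So, assuming you have already located the slice of the long vector that belongs to one fully connected layer, restoring the Matlab matrix is just a column-major reshape (w_vec and the field name .w below are assumptions for illustration):

```matlab
% Sketch: rebuild a fully connected layer's weight matrix from its slice
% of the long vector. w_vec and the field name .w are illustrative only.
cur  = size(layers{l}.w, 1);       % neurons in the current layer (rows)
prev = size(layers{l}.w, 2);       % neurons in the previous layer (columns)
W = reshape(w_vec, [cur prev]);    % column-major reshape, matching the storage order above
```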

In other words, consecutive weights correspond to consecutive neurons (or maps) in the current layer. You can also use the function 'setweights' to split the vector into arrays of the proper size inside the structure 'layers', as sketched below.
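
For example, something along these lines (the exact argument lists of getweights/setweights are not shown in this thread, so treat the call pattern as approximate):

```matlab
% Rough usage sketch; the precise signatures of getweights/setweights are
% an assumption here.
weights = getweights(layers);            % one long column vector: weights first, then biases
layers  = setweights(layers, weights);   % split it back into per-layer arrays (layers{l}.k, etc.)
```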

diegostefano commented 9 years ago

Thanks a lot, Sergey! I used your last tip about the "setweights" function and it worked: I got the weights back in their original shapes!

Thank you again!