wenwei202 / caffe

Caffe for Sparse and Low-rank Deep Neural Networks

nn_decomposer does not support kernel_h and kernel_w #14

Open hahne opened 6 years ago

hahne commented 6 years ago

nn_decomposer.py expects cur_layer_param.convolution_param.kernel_size._values to be set. This means the convolutional kernel must be square, so kernel_h and kernel_w cannot differ. Is there any special reason for this?

wenwei202 commented 6 years ago

@hahne There is no constraint on the kernel size from the algorithm perspective. It works in general (see here).

The code expects those only because I was lazy and wanted to avoid a cumbersome if/else block checking whether the kernel size is given via kernel_size or via kernel_h/kernel_w. And if it is given via kernel_size, we would also have to check whether it is a list or a scalar to handle the general case.
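For illustration, the if/else dispatch described above could be sketched like this. `get_kernel_hw` is a hypothetical helper, not part of nn_decomposer.py; `kernel_size` mirrors Caffe's repeated proto field (empty when unset, one element for a square kernel, two for per-axis sizes), and `kernel_h`/`kernel_w` default to 0, which is Caffe's "unset" value.

```python
def get_kernel_hw(kernel_size, kernel_h=0, kernel_w=0):
    """Return (height, width) of a conv kernel from Caffe-style params.

    Sketch only: handles both the repeated `kernel_size` field and the
    explicit `kernel_h`/`kernel_w` pair.
    """
    if kernel_h and kernel_w:
        # Explicit rectangular kernel via kernel_h/kernel_w.
        return kernel_h, kernel_w
    if len(kernel_size) == 1:
        # A single kernel_size value means a square kernel.
        return kernel_size[0], kernel_size[0]
    if len(kernel_size) == 2:
        # Two values are per-axis sizes: (height, width).
        return kernel_size[0], kernel_size[1]
    raise ValueError("kernel size is unspecified")
```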

hahne commented 6 years ago

@wenwei202 ok, understood. Thank you for the info!

wenwei202 commented 6 years ago

@hahne I suspect those asserts are not necessary for the linear combination layer, but we would need to delete kernel_h and kernel_w if they are copied over from the decomposed conv layer. The same goes for stride, pad and dilation.
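The cleanup described above could look roughly like this. This is a sketch over plain dicts standing in for the copied ConvolutionParameter fields; `to_linear_combination_param` and the field list are hypothetical, chosen to match the fields named in the comment (kernel_h/kernel_w, stride, pad, dilation), and the 1x1 kernel size reflects the linear combination layer being a 1x1 conv.

```python
# Spatial fields that only make sense on the decomposed conv layer,
# not on the 1x1 linear combination layer (assumed field names follow
# Caffe's ConvolutionParameter).
SPATIAL_FIELDS = {
    "kernel_size", "kernel_h", "kernel_w",
    "stride", "stride_h", "stride_w",
    "pad", "pad_h", "pad_w", "dilation",
}

def to_linear_combination_param(conv_fields):
    """Copy conv params, dropping spatial settings and forcing a 1x1 kernel.

    `conv_fields` is a dict of the fields copied from the decomposed
    conv layer; the return value is what the linear combination layer
    would keep.
    """
    out = {k: v for k, v in conv_fields.items() if k not in SPATIAL_FIELDS}
    out["kernel_size"] = 1  # the linear combination layer is a 1x1 conv
    return out
```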