ethanhe42 / channel-pruning

Channel Pruning for Accelerating Very Deep Neural Networks (ICCV'17)
https://arxiv.org/abs/1707.06168
MIT License

What is the sparsity of VGG-16? #79

Closed s0606757 closed 6 years ago

s0606757 commented 6 years ago

Hi~

May I ask a question about the sparsity of VGG-16 (version: channel pruning, speedup: 5x)?

Could you list it for me?

Thank you so much for helping me! ^_^

ethanhe42 commented 6 years ago

Refer to the prototxt: https://github.com/yihui-he/channel-pruning/blob/master/temp/channel_pruning.prototxt
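
For example, here is a quick way to read the per-layer channel counts out of that file (a minimal sketch; the regex-based parsing is only an illustration, not something this repo provides):

```python
import re

# Path of the pruned prototxt inside this repo.
PROTOTXT = "temp/channel_pruning.prototxt"

with open(PROTOTXT) as f:
    text = f.read()

# Pair each layer's name with its num_output (the channel count kept
# after pruning). Layers without num_output are skipped, since the
# character class [^}] cannot cross a layer's closing brace.
for name, channels in re.findall(
        r'name:\s*"([^"]+)"[^}]*?num_output:\s*(\d+)', text):
    print(f"{name}: {channels} channels")
```

Comparing these counts against the original VGG-16 prototxt gives the per-layer pruning ratio.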

s0606757 commented 6 years ago

Sorry to bother you again ^^" I don't fully understand, so let me give an example of what I'm asking.

For example, VGG-16 at 5x speedup:

    layer {
      name: "conv1_1"
      type: "Convolution"
      bottom: "data"
      top: "conv1_1"
      convolution_param {
        num_output: 24
        pad: 1
        kernel_size: 3
      }
    }

1. Is the size of the output activations 224x224x24 after conv1_1?
2. Must the elements in the output activation volume all be non-zero?

ethanhe42 commented 6 years ago

Yes.
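
To spell out the shape part of the answer: with kernel_size 3, pad 1, and stride 1, the spatial resolution is preserved, so conv1_1 applied to a 224x224 input produces a 224x224x24 volume. A minimal check using the standard convolution output-size formula (the helper name here is hypothetical):

```python
def conv_out_size(in_size, kernel, pad, stride=1):
    # Standard formula: out = floor((in + 2*pad - kernel) / stride) + 1
    return (in_size + 2 * pad - kernel) // stride + 1

h = w = conv_out_size(224, kernel=3, pad=1)  # 224
print((h, w, 24))  # (224, 224, 24): the conv1_1 output volume
```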

s0606757 commented 6 years ago

I appreciated your time and effort for answering my question ^_^

Actually, my research is on CNN accelerators that support sparse CNNs, so I am surveying sparse models such as AlexNet, VGG, and ResNet. Do you know of other papers that have also released their pruned or sparse VGG models on the Internet, like you have?

Lastly, words are not enough to express my gratitude. Thank you so much!