Closed: gaqiness closed this issue 6 years ago
Please see this post: https://github.com/yihui-he/channel-pruning/wiki/inference-time-on-GPU
I don't understand why GPU memory usage would increase for the channel-pruned model. Thanks!
Please read the 3C part of the paper.
Many thanks!
All are welcome to create issues, but please google the problem first, and make sure it has not already been reported.
What steps reproduce the bug?
Hi, I tested VGG-16 with the following command:

caffe test -model channel_pruning_VGG-16_3C4x.prototxt -weights channel_pruning_VGG-16_3C4x.caffemodel -iterations 5000 -gpu 0

Comparing against the original VGG-16, I found that GPU memory usage increased.
GPU memory consumption on an NVIDIA GTX 1080 (batch_size = 10):
channel_pruning_VGG-16_3C4x.caffemodel: 1773 MB
original_VGG-16.caffemodel: 1503 MB
why does it increase so much? Is that normal?
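Not an official answer, but one possible intuition for the overhead: the 3C approach decomposes each convolution into several smaller layers, so the network must hold additional intermediate feature-map blobs in GPU memory even though total FLOPs drop. A minimal Python sketch with hypothetical layer shapes (the channel counts below are illustrative, not taken from the released prototxt):

```python
def activation_bytes(n, c, h, w, dtype_bytes=4):
    """Memory for one float32 feature-map blob of shape (N, C, H, W)."""
    return n * c * h * w * dtype_bytes

batch, h, w = 10, 56, 56  # batch_size = 10, as in the report above

# Original layer: a single 256-channel output blob.
original = activation_bytes(batch, 256, h, w)

# After decomposition: two hypothetical intermediate blobs plus the
# final 256-channel output must all be stored.
decomposed = (activation_bytes(batch, 96, h, w)     # intermediate blob 1
              + activation_bytes(batch, 128, h, w)  # intermediate blob 2
              + activation_bytes(batch, 256, h, w)) # final output

print(original // 2**20, "MiB vs", decomposed // 2**20, "MiB")
```

Summed over all decomposed layers, these extra intermediate blobs could plausibly account for a few hundred MB of additional GPU memory.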
What hardware and operating system/distribution are you running?
Operating system: Ubuntu 16.04
CUDA version: 8.0
cuDNN version: 6.0
OpenCV version: 2.4.9
BLAS:
Python version: 3.5
If the bug is a crash, provide the backtrace.