ruihangdu / Decompose-CNN

CP and Tucker decomposition for Convolutional Neural Networks

Goal

The goal of this program is to decompose each convolutional layer in a model to reduce the total number of floating-point operations (I'll use the shorthand flops) in the convolutions, as well as the number of parameters in the model.
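Concretely, a Tucker-2 decomposition replaces one KxK convolution with a 1x1 convolution that compresses the input channels, a smaller KxK convolution between the reduced channel dimensions, and a 1x1 convolution that restores the output channels. Below is a minimal sketch of that idea using a truncated HOSVD in plain PyTorch; the function name and the way ranks are passed in are mine for illustration, not this repo's API.

```python
import torch
import torch.nn as nn

def tucker2_decompose_conv(conv: nn.Conv2d, rank_in: int, rank_out: int) -> nn.Sequential:
    """Replace `conv` with a 1x1 -> KxK -> 1x1 sequence via truncated HOSVD.

    Illustrative sketch only: ranks are taken as arguments here, whereas
    the repo selects them automatically.
    """
    W = conv.weight.data                      # (C_out, C_in, kH, kW)
    C_out, C_in, kH, kW = W.shape

    # Factor matrices from SVDs of the mode-0 (output-channel) and
    # mode-1 (input-channel) unfoldings of the weight tensor.
    U0 = torch.linalg.svd(W.reshape(C_out, -1), full_matrices=False)[0][:, :rank_out]
    U1 = torch.linalg.svd(W.permute(1, 0, 2, 3).reshape(C_in, -1),
                          full_matrices=False)[0][:, :rank_in]

    # Core tensor: contract W with the transposed factors on modes 0 and 1.
    core = torch.einsum('oa,ib,oihw->abhw', U0, U1, W)

    first = nn.Conv2d(C_in, rank_in, 1, bias=False)
    middle = nn.Conv2d(rank_in, rank_out, (kH, kW), stride=conv.stride,
                       padding=conv.padding, bias=False)
    last = nn.Conv2d(rank_out, C_out, 1, bias=conv.bias is not None)

    first.weight.data = U1.t().unsqueeze(-1).unsqueeze(-1)   # (r_in, C_in, 1, 1)
    middle.weight.data = core                                # (r_out, r_in, kH, kW)
    last.weight.data = U0.unsqueeze(-1).unsqueeze(-1)        # (C_out, r_out, 1, 1)
    if conv.bias is not None:
        last.bias.data = conv.bias.data

    return nn.Sequential(first, middle, last)
```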

Contributions

This is an extension of the work at https://github.com/jacobgil/pytorch-tensor-decompositions. In this implementation, everything, including finding the ranks and the actual CP/Tucker decomposition, is done in PyTorch without switching to NumPy.
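The rank-selection criterion lives in the repo's own code and is not reproduced here. As a stand-in illustration of what "finding the ranks in PyTorch" can look like, here is a simple energy-threshold heuristic on the singular values of a weight unfolding; this is my substitute, not necessarily the criterion the repo uses (the original work it extends uses variational Bayesian matrix factorization):

```python
import torch

def energy_rank(unfolding: torch.Tensor, energy: float = 0.9) -> int:
    """Smallest rank whose singular values keep `energy` of the squared spectrum.

    A simple stand-in heuristic; the repo's actual criterion may differ.
    """
    s = torch.linalg.svdvals(unfolding)
    cumulative = torch.cumsum(s ** 2, dim=0) / torch.sum(s ** 2)
    # Count the singular values needed to cross the energy threshold.
    return int((cumulative < energy).sum().item()) + 1

# Ranks for the two channel modes of a conv weight W of shape (C_out, C_in, kH, kW):
# rank_out = energy_rank(W.reshape(W.shape[0], -1))
# rank_in  = energy_rank(W.permute(1, 0, 2, 3).reshape(W.shape[1], -1))
```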

CNN architectures decomposed

AlexNet and ResNet50 (see the Results section below).

Dataset used

ImageNet; the top-1/top-5 accuracies reported below are measured on its validation set.

Usage

python3 scripts/decomp.py [-p PATH] [-d DECOMPTYPE] [-m MODEL] [-r CHECKPOINT] [-s STATEDICT] [-v]
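For example, a Tucker decomposition of ResNet50 might be invoked like this (the flag values are illustrative; see scripts/decomp.py for the exact meaning of each option):

python3 scripts/decomp.py -p <path> -d tucker -m resnet50 -v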

Pre-decomposed and fine-tuned model

A pre-decomposed ResNet50 is included in the models directory as resnet50_tucker.pth.

The fine-tuned parameters for the model are in resnet50_tucker_state.pth in the models directory.
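Loading them should follow the usual PyTorch pattern; a minimal sketch, assuming resnet50_tucker.pth is a pickled model object and resnet50_tucker_state.pth is a matching state dict:

```python
import torch

# Assumption: the .pth files follow standard PyTorch conventions.
# On newer PyTorch versions, unpickling a full model object may
# require weights_only=False.
model = torch.load('models/resnet50_tucker.pth', map_location='cpu')
model.load_state_dict(torch.load('models/resnet50_tucker_state.pth', map_location='cpu'))
model.eval()
```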

Results

It turns out that Tucker decomposition yields a lower accuracy loss than CP decomposition in my experiments, so the results below are all from Tucker decomposition.

AlexNet

|        | Top-1  | Top-5  | Flops in convolutions (G) |
|--------|--------|--------|---------------------------|
| Before | 56.55% | 79.09% | 1.31                      |
| After  | 54.90% | 77.90% | 0.45                      |

ResNet50

|        | Top-1  | Top-5  | Flops in convolutions (G) |
|--------|--------|--------|---------------------------|
| Before | 76.15% | 92.87% | 7.0                       |
| After  | 74.88% | 92.39% | 4.7                       |
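In other words, Tucker decomposition cuts the convolution flops by roughly 2.9x for AlexNet (1.31 G to 0.45 G) at a 1.65-point top-1 cost, and by roughly 1.5x for ResNet50 (7.0 G to 4.7 G) at a 1.27-point top-1 cost.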


Any comments, thoughts, and improvements are appreciated.