Open Little0o0 opened 2 years ago
I found that a model without `DataParallel` wrapping it will fail to prune, i.e. `--load-serialized` disables pruning.
When I run

```
python compress_classifier.py -a=resnet20_cifar -p=50 ../../../data/cifar10/ -j=22 --epochs=1 --lr=0.001 --masks-sparsity --compress=../agp-pruning/resnet18.schedule_agp.yaml --load-serialized
```
the total sparsity is always 0.00:

```
Total sparsity: 0.00
```
But if I run the same command line without `--load-serialized`:

```
python compress_classifier.py -a=resnet20_cifar -p=50 ../../../data/cifar10/ -j=22 --epochs=1 --lr=0.001 --masks-sparsity --compress=../agp-pruning/resnet18.schedule_agp.yaml
```
the total sparsity is 1.53 after 1 epoch:

```
Total sparsity: 1.53
```
I found that `model = torch.nn.DataParallel(model, device_ids=device_ids)` is necessary for pruning, but I do not know the reason.
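A plausible explanation (an assumption, not verified against Distiller's internals): `nn.DataParallel` stores the wrapped model as `self.module`, so every parameter name gains a `module.` prefix. If the pruning schedule's masks are keyed by the wrapped names (e.g. `module.layer1.0.conv1.weight`), they match nothing on a bare, unwrapped model, and pruning silently becomes a no-op. The minimal sketch below only demonstrates the name-prefix behavior of PyTorch itself:

```python
import torch.nn as nn

# A tiny stand-in model (hypothetical; any nn.Module shows the same effect).
model = nn.Linear(4, 2)
bare_names = [name for name, _ in model.named_parameters()]

# Wrapping in DataParallel re-parents the model under the attribute
# "module", which prefixes every parameter name with "module.".
wrapped = nn.DataParallel(model)
wrapped_names = [name for name, _ in wrapped.named_parameters()]

print(bare_names)     # ['weight', 'bias']
print(wrapped_names)  # ['module.weight', 'module.bias']
```

If that is indeed the cause, the mask names in the schedule YAML would need to match the model exactly as it is constructed, with or without the `module.` prefix, for the pruner to find the tensors it should mask.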