microsoft / nni

An open source AutoML toolkit for automating the machine learning lifecycle, including feature engineering, neural architecture search, model compression and hyper-parameter tuning.
https://nni.readthedocs.io
MIT License
14k stars 1.81k forks

Large performance drop after applying pruning + speedup #5664

Open CatarinaGouveia opened 1 year ago

CatarinaGouveia commented 1 year ago

Describe the issue: Hello, I applied the L1NormPruner as in the tutorial (to all Conv2d layers in my model), but the accuracy drops from ~50 mIoU to 0 and it doesn't seem to improve much during the retraining afterwards. I have tried a total_sparsity of 0.9, 0.8, 0.7, 0.6 and 0.5, and it happens in all cases. Pruning is supposed to slightly decrease the accuracy, but not this drastically, right? Do you have any guesses as to why this is happening?

Code that I'm using (`model` and `device` are defined earlier):

```python
import torch
from nni.compression.pytorch.pruning import L1NormPruner
from nni.compression.pytorch.speedup import ModelSpeedup

# prune all Conv2d layers to the given total sparsity
config_list = [{
    'op_types': ['Conv2d'],
    'total_sparsity': 0.9,
}]

pruner = L1NormPruner(model, config_list)

# compress the model and generate the masks
_, masks = pruner.compress()

# need to unwrap the model, if the model is wrapped before speedup
pruner._unwrap_model()

# physically remove the masked channels using a dummy input
ModelSpeedup(model, torch.rand(1, 3, 480, 640).to(device), masks).speedup_model()
```
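For context, the retraining I mention above is a standard PyTorch fine-tuning loop along these lines (a minimal sketch, not my exact code; `finetune` and its arguments are placeholder names, and `model`, `train_loader`, and `criterion` are assumed to be defined elsewhere):

```python
import torch
import torch.nn as nn

# Sketch of a post-speedup fine-tuning loop. After ModelSpeedup the pruned
# channels are physically removed, so accuracy has to be recovered by
# retraining the smaller model.
def finetune(model, train_loader, criterion, epochs=10, lr=1e-3, device="cpu"):
    optimizer = torch.optim.SGD(model.parameters(), lr=lr, momentum=0.9)
    model.train()
    for _ in range(epochs):
        for inputs, targets in train_loader:
            inputs, targets = inputs.to(device), targets.to(device)
            optimizer.zero_grad()
            loss = criterion(model(inputs), targets)
            loss.backward()
            optimizer.step()
    return model
```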