Open rishabh-WIAI opened 5 months ago
Describe the issue:
Calling `ModelSpeedup(...).speedup_model()` on a pruned torchvision ResNet-18 fails with `AttributeError: 'NoneType' object has no attribute 'data'`, raised from `v.grad.data` in `LeafModuleMaskUpdater.indirect_update_process` (full traceback below).

Environment:
Python 3.9 in a conda environment, NNI with the new `nni.compression` API (see the paths in the traceback), PyTorch / torchvision.

Configuration:
See the reproduction script below: all `Conv2d` layers pruned with `L1NormPruner` at `sparse_ratio` 0.1.
Log message:
```
AttributeError                            Traceback (most recent call last)
Input In [19], in <cell line: 9>()
     22 _, masks = pruner.compress()
     23 pruner.unwrap_model()
---> 25 model = ModelSpeedup(model, dummy_input, masks).speedup_model()

File ~/miniconda3/envs/optha/lib/python3.9/site-packages/nni/compression/speedup/model_speedup.py:435, in ModelSpeedup.speedup_model(self)
    433 self.initialize_update_sparsity()
    434 self.update_direct_sparsity()
--> 435 self.update_indirect_sparsity()
    436 self.logger.info('Resolve the mask conflict after mask propagate...')
    437 # fix_mask_conflict(self.masks, self.graph_module, self.dummy_input)

File ~/miniconda3/envs/optha/lib/python3.9/site-packages/nni/compression/speedup/model_speedup.py:306, in ModelSpeedup.update_indirect_sparsity(self)
    304 for node in reversed(self.graph_module.graph.nodes):
    305     node: Node
--> 306     self.node_infos[node].mask_updater.indirect_update_process(self, node)
    307     sp = f', {sparsity_stats(self.masks.get(node.target, {}))}' if node.op == 'call_module' else ''
    308     sp += f', {sparsity_stats({"output mask": self.node_infos[node].output_masks})}'

File ~/miniconda3/envs/optha/lib/python3.9/site-packages/nni/compression/speedup/mask_updater.py:229, in LeafModuleMaskUpdater.indirect_update_process(self, model_speedup, node)
    227 for k, v in node_info.module.named_parameters():
    228     if isinstance(v, torch.Tensor) and model_speedup.tensor_propagate_check(v) and v.dtype in torch_float_dtype:
--> 229         grad_zero = v.grad.data == 0
    230         node_info.param_masks[k][grad_zero] = 0

AttributeError: 'NoneType' object has no attribute 'data'
```
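For context on the failing line: `indirect_update_process` reads `v.grad.data` for every floating-point parameter of the leaf module, but `.grad` is `None` for any parameter that never took part in a backward pass. The snippet below is a minimal PyTorch-only sketch of that situation (it does not use NNI), and the guard at the end only illustrates what a local workaround around `mask_updater.py:229` might look like; it is an assumption on my side, not NNI's actual fix.

```python
import torch
import torch.nn as nn

# Two parameters, but only `used` participates in the forward/backward pass,
# so `unused.grad` stays None after backward() -- the same condition that
# makes `v.grad.data` raise AttributeError in mask_updater.py.
used = nn.Parameter(torch.randn(4))
unused = nn.Parameter(torch.randn(4))

loss = (used * 2).sum()   # `unused` never enters the autograd graph
loss.backward()

print(used.grad)    # tensor([2., 2., 2., 2.])
print(unused.grad)  # None

# Hypothetical defensive guard (assumption, not NNI's actual code):
for name, p in {'used': used, 'unused': unused}.items():
    if p.grad is None:
        continue  # skip parameters that received no gradient
    grad_zero = p.grad == 0
    print(name, grad_zero)
```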
How to reproduce it?:
```python
import torch
import torch.nn as nn
import torchvision.models as tvmodels

from nni.compression.pruning import L1NormPruner
from nni.compression.utils import auto_set_denpendency_group_ids
from nni.compression.speedup import ModelSpeedup

if __name__ == '__main__':
    model = tvmodels.resnet18()
    model.fc = nn.Linear(in_features=512, out_features=1, bias=True)

    config_list = [{
        'op_types': ['Conv2d'],
        'sparse_ratio': 0.1
    }]

    dummy_input = torch.rand(1, 3, 224, 224)
    config_list = auto_set_denpendency_group_ids(model, config_list, dummy_input)

    pruner = L1NormPruner(model, config_list)
    _, masks = pruner.compress()
    pruner.unwrap_model()

    model = ModelSpeedup(model, dummy_input, masks).speedup_model()
```
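For debugging, a plain-PyTorch snippet like the one below could be inserted right after `pruner.unwrap_model()` and before the `ModelSpeedup(...)` call. It runs one forward/backward on the same dummy input and lists every parameter whose `.grad` is still `None` afterwards; presumably those are the candidates for the `v.grad.data` crash above. The summed output used as a loss is an arbitrary stand-in, not part of the original script.

```python
# Hypothetical diagnostic (not in the original repro): populate gradients
# once and report parameters that never receive one.
model.train()
out = model(dummy_input)
out.sum().backward()  # arbitrary scalar loss, only to fill .grad

for name, param in model.named_parameters():
    if param.grad is None:
        print('no grad:', name)

model.zero_grad(set_to_none=True)  # reset before calling ModelSpeedup
```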