microsoft / nni

An open source AutoML toolkit for automating the machine learning lifecycle, including feature engineering, neural architecture search, model compression and hyper-parameter tuning.
https://nni.readthedocs.io
MIT License
14.06k stars 1.82k forks

AttributeError: '_output_randomize' not found #5631

Open · Xue-JW opened this issue 1 year ago

Xue-JW commented 1 year ago

Describe the bug:

When I try to execute:

ModelSpeedup(model, torch.rand(8, 3, 512, 512).to(device), masks).speedup_model()
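For context, a minimal, hypothetical sketch of how that call is typically reached; the stand-in model, the hand-built mask, and the mask layout are assumptions (the original script prunes a YOLOv5-Lite model with masks produced by an NNI pruner):

```python
# Hypothetical minimal sketch; not the original script.
import torch
import torch.nn as nn
from nni.compression.pytorch.speedup.v2 import ModelSpeedup

device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')

# Stand-in for the pruned detection model.
model = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1),
    nn.BatchNorm2d(16),
    nn.SiLU(),
    nn.Conv2d(16, 32, 3, padding=1),
).to(device)

# `masks` normally comes from an NNI pruner; here a hand-built mask that
# zeroes half of the first conv's filters stands in for it.
mask = torch.ones(16, 3, 3, 3, device=device)
mask[:8] = 0
masks = {'0': {'weight': mask}}

# The failing call from the report, with the real model/masks replaced.
ModelSpeedup(model, torch.rand(8, 3, 512, 512).to(device), masks).speedup_model()
```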

I encountered an error. The error message is as follows:

 File "/home/gavinx/Downloads/YOLOv5-Lite/test_nni_pruning2.py", line 143, in prune
    ModelSpeedup(model, torch.rand(1, 3, 512, 512).to(device), masks).speedup_model()
  File "/home/gavinx/Downloads/nni/nni/compression/pytorch/speedup/v2/model_speedup.py", line 433, in speedup_model
    self.update_direct_sparsity()
  File "/home/gavinx/Downloads/nni/nni/compression/pytorch/speedup/v2/model_speedup.py", line 286, in update_direct_sparsity
    self.node_infos[node].mask_updater.direct_update_process(self, node)
  File "/home/gavinx/Downloads/nni/nni/compression/pytorch/speedup/v2/mask_updater.py", line 261, in direct_update_process
    del model_speedup.node_infos[to_delete]._output_randomize
AttributeError: _output_randomize
Exception has occurred: TypeError
unsupported operand type(s) for *: 'NoneType' and 'Tensor'
  File "/home/gavinx/Downloads/nni/nni/compression/pytorch/speedup/v2/mask_updater.py", line 401, in <lambda>
    input_grad = tree_map_zip(lambda t, m: (t * m).type_as(t) if isinstance(m, torch.Tensor) else t, \
  File "/home/gavinx/Downloads/nni/nni/compression/pytorch/speedup/v2/utils.py", line 82, in <listcomp>
    return tree_unflatten([fn(*args) for args in zip(*flat_args_list)], spec_list[0])
  File "/home/gavinx/Downloads/nni/nni/compression/pytorch/speedup/v2/utils.py", line 82, in tree_map_zip
    return tree_unflatten([fn(*args) for args in zip(*flat_args_list)], spec_list[0])
  File "/home/gavinx/Downloads/nni/nni/compression/pytorch/speedup/v2/mask_updater.py", line 401, in indirect_getitem
    input_grad = tree_map_zip(lambda t, m: (t * m).type_as(t) if isinstance(m, torch.Tensor) else t, \
  File "/home/gavinx/Downloads/nni/nni/compression/pytorch/speedup/v2/mask_updater.py", line 463, in indirect_update_process
    indirect_fn(model_speedup, node)
  File "/home/gavinx/Downloads/nni/nni/compression/pytorch/speedup/v2/model_speedup.py", line 305, in update_indirect_sparsity
    self.node_infos[node].mask_updater.indirect_update_process(self, node)
  File "/home/gavinx/Downloads/nni/nni/compression/pytorch/speedup/v2/model_speedup.py", line 434, in speedup_model
    self.update_indirect_sparsity()
  File "/home/gavinx/Downloads/YOLOv5-Lite/test_nni_pruning2.py", line 140, in prune
    ModelSpeedup(model, torch.rand(8, 3, 512, 512).to(device), masks).speedup_model()
  File "/home/gavinx/Downloads/YOLOv5-Lite/test_nni_pruning2.py", line 170, in <module>
    prune(opt)

Environment:

Reproduce the problem

I found a possible solution to address this issue: change this line from

del model_speedup.node_infos[to_delete]._output_randomize

to

del model_speedup.node_infos[to_delete].output_randomize

and this line from

input_grad = tree_map_zip(lambda t, m: (t * m).type_as(t) if isinstance(m, torch.Tensor) else t, \

to

input_grad = tree_map_zip(lambda t, m: (t * m).type_as(t) if isinstance(m, torch.Tensor) and t is not None else t, \
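The second change matters in the indirect (backward) sparsity pass: presumably some inputs never receive a gradient, so `t` arrives as `None` and `None * mask` raises exactly the TypeError shown above. Below is a minimal standalone sketch of the guarded mapping in plain Python, not NNI's `tree_map_zip`; `apply_mask` is a hypothetical stand-in:

```python
# Why the `t is not None` guard is needed: a None gradient multiplied by a
# mask tensor raises "unsupported operand type(s) for *: 'NoneType' and 'Tensor'".
import torch

def apply_mask(grads, masks):
    # Mirror of the guarded lambda: multiply only when both the gradient and
    # the mask are real tensors, otherwise pass the value through unchanged.
    return [
        (t * m).type_as(t) if isinstance(m, torch.Tensor) and t is not None else t
        for t, m in zip(grads, masks)
    ]

grads = [torch.ones(4), None]                    # second input produced no gradient
masks = [torch.tensor([1., 0., 1., 0.]), torch.ones(4)]

print(apply_mask(grads, masks))                  # [tensor([1., 0., 1., 0.]), None]
```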
lminer commented 1 year ago

I'm getting this error as well