quic / aimet

AIMET is a library that provides advanced quantization and compression techniques for trained neural network models.
https://quic.github.io/aimet-pages/index.html

how to ignore gru layer for Adaround.apply_adaround? #2491

Closed: bestboybsh closed this issue 11 months ago

bestboybsh commented 11 months ago

Hi, I'm trying to use AdaRound on a model that contains several CNN layers and GRU layers.

However, it fails with the message below:

AttributeError: 'str' object has no attribute 'enabled'.

When I exclude the GRU layer from the model, it works fine, so I think the GRU layer needs to be excluded when applying AdaRound.

Is there a way to do that?
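
For context, the apply_adaround signature shown in the traceback further down exposes an ignore_quant_ops_list argument. A minimal sketch of passing the GRU module through it, assuming the GRU is reachable as an attribute named gru on the model (the attribute name is illustrative):

import copy
from aimet_torch.adaround.adaround_weight import Adaround
from aimet_common.defs import QuantScheme

bn_folded_model = copy.deepcopy(model).eval()
# ignore_quant_ops_list takes the module instances AdaRound should skip;
# bn_folded_model.gru is a hypothetical attribute name for the GRU submodule.
ada_model = Adaround.apply_adaround(bn_folded_model, dummy_input_total.cuda(), params,
                                    path='./', filename_prefix='adaround',
                                    default_param_bw=8,
                                    ignore_quant_ops_list=[bn_folded_model.gru],
                                    default_quant_scheme=QuantScheme.post_training_tf_enhanced)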

quic-mangal commented 11 months ago

Hi @bestboybsh, could you tell us which framework you are referring to?

bestboybsh commented 11 months ago

Hi @quic-mangal, I am quantizing a PyTorch model.

quic-mangal commented 11 months ago

The GRU layer is already excluded from AdaRound; we only adaround the module types in AdaroundSupportedModules = (torch.nn.Conv2d, torch.nn.ConvTranspose2d, torch.nn.Linear).

That being said, could you post the full error for better analysis?
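
To see which layers of a given model AdaRound will actually touch, you can filter the model's modules against that same tuple. A quick sketch in plain PyTorch, assuming model is the network from the question:

import torch

# Mirrors the AdaroundSupportedModules tuple quoted above.
supported = (torch.nn.Conv2d, torch.nn.ConvTranspose2d, torch.nn.Linear)

for name, module in model.named_modules():
    if isinstance(module, supported):
        print(f"adarounded: {name} ({type(module).__name__})")
    elif isinstance(module, (torch.nn.GRU, torch.nn.GRUCell)):
        print(f"skipped by AdaRound: {name} ({type(module).__name__})")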

bestboybsh commented 11 months ago

I ran the code below:

bn_folded_model = copy.deepcopy(model).eval()
ada_model = Adaround.apply_adaround(bn_folded_model, dummy_input_total.cuda(), params,
                                    path='./', filename_prefix='adaround', default_param_bw=8,
                                    default_quant_scheme=QuantScheme.post_training_tf_enhanced)

The error message is as follows:

2023-10-06 01:24:30,223 - Quant - INFO - No config file provided, defaulting to config file at /root/miniconda3/lib/python3.8/site-packages/aimet_common/quantsim_config/default_config.json
2023-10-06 01:24:30,247 - Quant - INFO - Unsupported op type Squeeze
2023-10-06 01:24:30,248 - Quant - INFO - Unsupported op type Pad
2023-10-06 01:24:30,249 - Quant - INFO - Unsupported op type Mean
2023-10-06 01:24:30,253 - Utils - INFO - ...... subset to store [Conv_86, Relu_88]
2023-10-06 01:24:30,254 - Utils - INFO - ...... subset to store [Conv_90, Relu_92]
2023-10-06 01:24:30,255 - Utils - INFO - ...... subset to store [Conv_70, Relu_72]
2023-10-06 01:24:30,256 - Utils - INFO - ...... subset to store [Conv_74, Relu_76]
2023-10-06 01:24:30,257 - Utils - INFO - ...... subset to store [Conv_78, Relu_80]
2023-10-06 01:24:30,258 - Utils - INFO - ...... subset to store [Conv_82, Relu_84]
2023-10-06 01:24:30,259 - Quant - INFO - Selecting DefaultOpInstanceConfigGenerator to compute the specialized config. hw_version:default
---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
Cell In[41], line 2
      1 bn_folded_model = copy.deepcopy(model).eval()
----> 2 ada_model = Adaround.apply_adaround(bn_folded_model, dummy_input_total.cuda(), params,
      3                                     path='./', filename_prefix='adaround', default_param_bw=8,
      4                                     default_quant_scheme=QuantScheme.post_training_tf_enhanced)

File ~/miniconda3/lib/python3.8/site-packages/aimet_torch/adaround/adaround_weight.py:155, in Adaround.apply_adaround(cls, model, dummy_input, params, path, filename_prefix, default_param_bw, param_bw_override_list, ignore_quant_ops_list, default_quant_scheme, default_config_file)
    152 # Compute only param encodings
    153 cls._compute_param_encodings(quant_sim)
--> 155 return cls._apply_adaround(quant_sim, model, dummy_input, params, path, filename_prefix)

File ~/miniconda3/lib/python3.8/site-packages/aimet_torch/adaround/adaround_weight.py:180, in Adaround._apply_adaround(cls, quant_sim, model, dummy_input, params, path, filename_prefix)
    178 _, input_quantizers, output_quantizers = utils.get_all_quantizers(quant_sim.model)
    179 for quantizer in itertools.chain(input_quantizers, output_quantizers):
--> 180     assert not quantizer.enabled
    182 # Get the module - activation function pair using ConnectedGraph
    183 module_act_func_pair = connectedgraph_utils.get_module_act_func_pair(model, dummy_input)

AttributeError: 'str' object has no attribute 'enabled'
quic-hitameht commented 11 months ago

@bestboybsh Thanks for reporting this. This bug has already been fixed, but the fix is not part of the AIMET 1.28 release.

In the meantime, you can update the get_all_quantizers utility in aimet_torch/utils.py from the tip of the develop branch.

Let us know if you see any other issues.
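
To confirm whether the installed copy of the utility is the buggy one before patching it, you can call it directly; a diagnostic sketch, assuming a QuantizationSimModel named quant_sim has already been built from the same model (quant_sim is not defined in the snippets above):

import itertools
from aimet_torch import utils

# get_all_quantizers returns a 3-tuple whose last two elements are the input and
# output quantizer lists (as used in the traceback above). In the buggy release,
# some entries come back as plain strings instead of quantizer objects.
_, input_quantizers, output_quantizers = utils.get_all_quantizers(quant_sim.model)
string_entries = [q for q in itertools.chain(input_quantizers, output_quantizers)
                  if isinstance(q, str)]
print(f"{len(string_entries)} quantizer entries are strings (should be 0 after the fix)")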

quic-hitameht commented 11 months ago

Closing this as resolved. If you have another question, please re-open this issue or create a new one.