Xilinx / Vitis-AI

Vitis AI is Xilinx’s development stack for AI inference on Xilinx hardware platforms, including both edge devices and Alveo cards.
https://www.xilinx.com/ai
Apache License 2.0

BatchNorm1d problem #555

Closed: qw85639229 closed this issue 2 years ago

qw85639229 commented 2 years ago

When I try to quantize my model, I get the error below. Is BatchNorm1d not supported on this platform?


```
[VAIQ_NOTE]: =>Exporting quant config.(.qat/quant_info.json)
W1014 09:35:15.717100 quant_aware_training.py] Reused module (quant_stub) may lead to poor result of QAT, make sure this is what you expect.
Traceback (most recent call last):
  File "test_cluster.py", line 220, in <module>
    main()
  File "test_cluster.py", line 62, in main
    main_worker(args)
  File "test_cluster.py", line 114, in main_worker
    quantized_model = qat_processor.trainable_model(allow_reused_module=True)
  File "/opt/vitis_ai/conda/envs/vitis-ai-pytorch/lib/python3.6/site-packages/pytorch_nndct/quantization/quant_aware_training.py", line 314, in trainable_model
    self._insert_quantizer(model_topo, allow_reused_module)
  File "/opt/vitis_ai/conda/envs/vitis-ai-pytorch/lib/python3.6/site-packages/pytorch_nndct/quantization/quant_aware_training.py", line 489, in _insert_quantizer
    'yet. (Node name: {})').format(type(node.module), node.name))
NotImplementedError: The quantization of <class 'torch.nn.modules.batchnorm.BatchNorm1d'> not implemented yet. (Node name: feat_bn_max)
```
yuwang-xlnx commented 2 years ago

Hi @qw85639229,

Do you want to run post-training quantization (PTQ) or quant-aware training (QAT)?

From the error message, you are performing QAT on the model; please make sure this is what you expect. Currently, BatchNorm1d is not supported for QAT.
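A possible workaround (a sketch not confirmed by this thread, and the `BatchNorm1dAs2d` wrapper below is a hypothetical helper, not part of Vitis AI): `BatchNorm2d` applied to a `(N, C, 1, 1)` tensor computes the same per-channel statistics as `BatchNorm1d` applied to `(N, C)`, so a 1-D batchnorm that blocks QAT can often be replaced by its 2-D counterpart plus reshapes.

```python
import torch
import torch.nn as nn

class BatchNorm1dAs2d(nn.Module):
    """Accepts (N, C) input like BatchNorm1d, but uses BatchNorm2d internally."""

    def __init__(self, num_features):
        super().__init__()
        self.bn = nn.BatchNorm2d(num_features)

    def forward(self, x):
        # (N, C) -> (N, C, 1, 1) -> BatchNorm2d -> (N, C)
        return self.bn(x.unsqueeze(-1).unsqueeze(-1)).flatten(1)

if __name__ == "__main__":
    torch.manual_seed(0)
    x = torch.randn(8, 16)
    ref = nn.BatchNorm1d(16)
    alt = BatchNorm1dAs2d(16)
    # BatchNorm1d and BatchNorm2d share the same parameter/buffer layout,
    # so the state dict transfers directly.
    alt.bn.load_state_dict(ref.state_dict())
    print(torch.allclose(ref(x), alt(x), atol=1e-5))  # expect True
```

This only covers the `(N, C)` input case; `BatchNorm1d` over `(N, C, L)` input would need a different reshape, and whether the resulting graph quantizes cleanly still depends on the Vitis AI version.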