Closed · sherlockhz1415 closed this issue 2 years ago
TL;DR: the ch_axis of ConvTransposeReLU2d cannot be set to 1 during PTQ, which causes errors.
During PTQ, if we choose the per-channel mode, we are supposed to set the ch_axis of ConvTranspose to 1; otherwise we'll face errors.
According to https://github.com/ModelTC/MQBench/blob/e2175203c8e62596e66500a720a6cb1d1fc1dacd/mqbench/nn/qat/modules/deconv.py#L21-L23, the ch_axis is automatically converted to 1 whenever the input ch_axis is not -1; i.e., even if we set ch_axis = 0 in the config, it will still end up as 1 in ConvTranspose.
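The intent of that conversion can be sketched in plain Python (a simplified mimic of the logic, with illustrative names; this is not MQBench's real API):

```python
# Hypothetical stand-in for a per-channel fake-quantize observer.
class FakeQuantStub:
    def __init__(self, ch_axis=-1):
        self.ch_axis = ch_axis

def make_weight_fake_quant(configured_ch_axis):
    """Sketch of the ch_axis fix-up in mqbench/nn/qat/modules/deconv.py."""
    fq = FakeQuantStub(ch_axis=configured_ch_axis)
    # ConvTranspose weights are laid out (in_channels, out_channels/groups, kH, kW),
    # so the per-channel axis must be 1, unlike the axis 0 used for regular Conv.
    if fq.ch_axis != -1:   # per-channel mode requested
        fq.ch_axis = 1     # force the correct axis for ConvTranspose
    return fq

print(make_weight_fake_quant(0).ch_axis)   # → 1 (converted)
print(make_weight_fake_quant(-1).ch_axis)  # → -1 (per-tensor, untouched)
```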
ch_axis = 0
Meanwhile, in the __init__ of ConvTransposeReLU2d (https://github.com/ModelTC/MQBench/blob/e2175203c8e62596e66500a720a6cb1d1fc1dacd/mqbench/nn/intrinsic/qat/modules/deconv_fused.py#L372-L386), the ch_axis is overridden by the incoming qconfig, so the automatic conversion above never takes effect.
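The override can be illustrated with a minimal sketch (illustrative class and attribute names, not MQBench's real API): the fused subclass rebuilds the weight fake-quantizer straight from the qconfig after the parent class has already corrected the axis.

```python
class FakeQuant:
    """Hypothetical stand-in for a weight fake-quantizer."""
    def __init__(self, ch_axis=-1):
        self.ch_axis = ch_axis

class QATDeconv:
    """Mimics qat deconv: corrects ch_axis to 1 for per-channel mode."""
    def __init__(self, qconfig_ch_axis):
        self.weight_fake_quant = FakeQuant(qconfig_ch_axis)
        if self.weight_fake_quant.ch_axis != -1:
            self.weight_fake_quant.ch_axis = 1  # correct axis for deconv weights

class FusedDeconvReLU(QATDeconv):
    """Mimics the fused module's __init__."""
    def __init__(self, qconfig_ch_axis):
        super().__init__(qconfig_ch_axis)
        # This reassignment plays the role of deconv_fused.py:385-386:
        # it replaces the already-corrected quantizer with a fresh one
        # built from the qconfig, discarding the ch_axis fix-up.
        self.weight_fake_quant = FakeQuant(qconfig_ch_axis)

print(FusedDeconvReLU(0).weight_fake_quant.ch_axis)  # → 0, but should be 1
```

Deleting the reassignment (the analogue of the two lines in question) lets the corrected quantizer from the parent class survive.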
Please correct me if I'm wrong: shall we consider deleting L385 and L386?
Thanks!
You are correct; just deleting deconv_fused.py:385-386 will work. Looking forward to your MR!