Open tianhualefei opened 9 months ago
The GELU in the MLP layer is not quantized? The code seems to call the nn.GELU layer directly?
```
  (mlp): Mlp(
    (fc1): QLinear(
      in_features=384, out_features=1536, bias=True
      (quantizer): UniformQuantizer()
    )
    (act): GELU()
    (qact1): QAct(
      (quantizer): UniformQuantizer()
    )
    (fc2): QLinear(
      in_features=1536, out_features=384, bias=True
      (quantizer): UniformQuantizer()
    )
    (qact2): QAct(
      (quantizer): UniformQuantizer()
    )
    (drop): Dropout(p=0.0, inplace=False)
  )
  (qact4): QAct(
    (quantizer): UniformQuantizer()
  )
)
```
Yes, I think it is a "partially quantized ViT": the GELU itself runs in floating point, and qact1 only quantizes its output afterwards.