Yes. You can inherit from the default upsampling layer, add a quantization handle in `__init__()`, and quantize the input data and weights in `forward()`. Conv2d example: https://github.com/aovoc/nnieqat-pytorch/blob/a610a934d69be566b83c536c950b307fbf16193a/nnieqat/modules/conv.py#L303
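For reference, here is a minimal sketch of that pattern. It is not nnieqat's actual API: the `fake_quantize` helper below is a hypothetical stand-in for the library's quantization handle (see the linked conv.py for the real one), simulating 8-bit quantization by rounding onto a uniform grid. Since upsampling has no weight parameter, only the activation is quantized:

```python
import torch
import torch.nn as nn


def fake_quantize(t: torch.Tensor) -> torch.Tensor:
    """Hypothetical stand-in for nnieqat's quantization handle:
    simulates 8-bit quantization by rounding onto a uniform grid."""
    scale = t.abs().max().clamp(min=1e-8) / 127.0
    return torch.round(t / scale) * scale


class QuantUpsample(nn.Upsample):
    """Upsample layer that fake-quantizes its input in forward().
    Upsampling has no weight, so only the activation is quantized."""

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        # In nnieqat, the quantization handle would be created here.

    def forward(self, input):
        return super().forward(fake_quantize(input))
```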
OK, thanks for your work, I will try it.
CNN quantization can be divided into weight quantization and activation quantization; the upsampling layer does not have a weight parameter. Weight and activation quantization are now supported for all layers.
Is `__all__` = [ 'Linear', 'Bilinear', 'Conv1d', 'Conv2d', 'Conv3d', 'ConvTranspose1d', 'ConvTranspose2d', 'ConvTranspose3d', 'AvgPool1d', 'AvgPool2d', 'AvgPool3d', 'MaxPool1d', 'MaxPool2d', 'MaxPool3d', 'MaxUnpool1d', 'MaxUnpool2d', 'MaxUnpool3d', 'FractionalMaxPool2d', 'LPPool1d', 'LPPool2d', 'AdaptiveMaxPool1d', 'AdaptiveMaxPool2d', 'AdaptiveMaxPool3d', 'AdaptiveAvgPool1d', 'AdaptiveAvgPool2d', 'AdaptiveAvgPool3d' ] in nnieqat/modules/`__init__.py` the full list of supported layer types? If I have some other unsupported layer, how do I use the code?