Closed shisheng111 closed 2 months ago
Activation function: Mish. The code is:

```python
import torch
import torch.nn.functional as F

class Mish(torch.nn.Module):
    def __init__(self):
        super().__init__()

    def forward(self, x):
        # Mish activation: x * tanh(softplus(x))
        return x * torch.tanh(F.softplus(x))
```
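For reference, the same formula can be sanity-checked with a small pure-Python version (a sketch assuming the standard Mish definition, `x * tanh(softplus(x))`, with no PyTorch dependency):

```python
import math

def mish(x):
    # softplus(x) = ln(1 + e^x); mish(x) = x * tanh(softplus(x))
    return x * math.tanh(math.log1p(math.exp(x)))
```

At `x = 0` this returns exactly 0, and for large positive `x` it approaches `x`, matching the behavior of the module above.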
Filed internal bug 4657387 to track this, thanks for reporting this :-)
@shisheng111 Could you try a workaround by inserting Q/DQ ops between the BatchNorm and the activation functions?
We will add a proper fix in TRT in future versions.
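Here "Q/DQ ops" means explicit quantize/dequantize node pairs (ONNX `QuantizeLinear`/`DequantizeLinear`). As a rough illustration of what such a pair computes numerically, here is a minimal pure-Python sketch of per-tensor int8 fake quantization; this is only an explanatory model, not the ONNX or TensorRT API:

```python
def quantize_dequantize(x, scale, zero_point=0, qmin=-128, qmax=127):
    """Simulate an int8 Q/DQ pair on a single float value."""
    # Quantize: scale down, round to the integer grid, clamp to int8 range
    q = max(qmin, min(qmax, round(x / scale) + zero_point))
    # Dequantize: map back to float using the same scale/zero point
    return (q - zero_point) * scale
```

Inserting an explicit Q/DQ pair between the BatchNorm output and the Mish activation gives TensorRT the scale information it needs at that tensor, which is why the workaround can unblock quantized engine building.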
Fixed in TensorRT 10.2. Closing.
I seem to have hit a bug when using TensorRT 10.0.1.6. In the last step, when converting to a TensorRT model, the quantizer node could not be found. Is it because I used a custom nonlinear activation function, Mish? Here are my code and model.

This is the ONNX model: mobilenetv2_SQ.zip