Closed kai06046 closed 3 years ago
Indeed, ONNX is upgraded frequently, and MXNet cannot use the latest version of ONNX. This may bring about some unknown issues. It seems that you have found the problem, so you can do some extra work to fix it.
I also encountered this problem and wrote my own softmax as a workaround. The ONNX softmax op is not the same as the MXNet softmax. See http://www.xavierdupre.fr/app/mlprodict/helpsphinx/api/onnxrt_ops.html#softmax
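For reference, here is a minimal numpy sketch of the difference (function names are mine, not from any library): ONNX Softmax before opset 13 coerces the input to 2D at the given axis and normalizes over the entire trailing block, whereas the MXNet softmax normalizes along a single axis only.

```python
import numpy as np

def softmax_axis(x, axis=-1):
    # MXNet-style softmax: normalizes along a single axis.
    z = x - x.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def softmax_onnx_legacy(x, axis=1):
    # ONNX Softmax before opset 13: flatten input to 2D at `axis`,
    # apply softmax over the whole trailing block, reshape back.
    shape = x.shape
    flat = x.reshape(int(np.prod(shape[:axis])), -1)
    return softmax_axis(flat, axis=-1).reshape(shape)

x = np.arange(6, dtype=np.float32).reshape(1, 2, 3)
a = softmax_axis(x, axis=1)        # each channel pair sums to 1
b = softmax_onnx_legacy(x, axis=1) # all 6 values together sum to 1
print(np.allclose(a, b))  # False: the two ops normalize over different sets
```

So if the converter maps MXNet's axis-wise softmax onto the legacy ONNX op without accounting for this, the scores will disagree exactly as described here.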
I generated an ONNX model file for your v2 with to_onnx.py and ran it on onnxruntime.
There are 10 groups of output for one inference: a confidence score and a bbox for each of 5 scales. The bbox outputs match the MXNet inference exactly, but the confidence scores do not. To verify this, I modified the MXNet symbol to output the convolution scores before the softmax operation, then calculated the softmax myself and found the result consistent with the MXNet inference.
I wonder whether this problem comes from onnxruntime, the MXNet softmax symbol, or the ONNX conversion? P.S. Thank you for this great work!