linghu8812 / tensorrt_inference


How to export nanodet onnx model with softmax and concat? #47

Closed: stuarteiffert closed 3 years ago

stuarteiffert commented 3 years ago

Hi, I can't work out where the ONNX model is edited during conversion, as described in this comment: https://github.com/RangiLyu/nanodet/issues/65

After running `python tools/export_onnx.py`, my model still has 6 outputs instead of 1 concatenated output.

[Image: exported ONNX graph showing 6 separate outputs]

stuarteiffert commented 3 years ago

Issue solved by making sure the edited nanodet is set up by running `python setup.py develop` in https://github.com/linghu8812/nanodet before running export_onnx.py.
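For reference, the full sequence is roughly the following (a sketch based on the commands above; clone paths and any config/checkpoint flags export_onnx.py may need can differ per setup):

```bash
# Clone the modified fork and install it in development mode so the
# edited head (softmax + concat) is the one Python actually imports.
git clone https://github.com/linghu8812/nanodet.git
cd nanodet
python setup.py develop

# Re-run the export; it should now emit one concatenated output.
python tools/export_onnx.py
```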

yueyihua commented 3 years ago

@stuarteiffert @linghu8812 I have the same problem. I ran `python setup.py develop` in https://github.com/linghu8812/nanodet, but it didn't help.

yueyihua commented 3 years ago

The model output looks like this: [Image: ONNX graph still showing 6 separate outputs]
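One way to check whether an export actually produced a single concatenated output is to list the graph outputs with the `onnx` package. A minimal sketch, assuming the exported file is named `nanodet.onnx` (a placeholder):

```python
import onnx

# Load the exported model and print its graph outputs. A correctly
# patched export should show one output; the unpatched head shows six.
model = onnx.load("nanodet.onnx")  # placeholder path
for out in model.graph.output:
    dims = [d.dim_value for d in out.type.tensor_type.shape.dim]
    print(out.name, dims)
```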

linghu8812 commented 3 years ago

@yueyihua Check the output of these lines: https://github.com/linghu8812/nanodet/blob/266321f33a39a2154fa5054f60fbb0acd983b906/nanodet/model/head/nanodet_head.py#L136-L141
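The idea behind those lines is to post-process the raw head outputs inside the network, so the ONNX graph ends in one tensor: sigmoid the class logits, softmax the box-distribution logits, and concatenate everything across levels. A minimal sketch of that pattern (hypothetical names and shapes, not the fork's exact code; `reg_max=7` is the assumed nanodet-m default):

```python
import torch

def merge_head_outputs(cls_scores, bbox_preds, reg_max=7):
    """Fuse per-level head outputs into a single tensor for ONNX export.

    cls_scores: list of [N, num_classes, H, W] tensors (one per FPN level)
    bbox_preds: list of [N, 4 * (reg_max + 1), H, W] tensors
    Names and shapes are illustrative only.
    """
    merged = []
    for cls, reg in zip(cls_scores, bbox_preds):
        n = cls.shape[0]
        # [N, C, H, W] -> [N, H*W, C]; sigmoid turns logits into scores.
        cls = cls.flatten(2).permute(0, 2, 1).sigmoid()
        reg = reg.flatten(2).permute(0, 2, 1)
        # Softmax over the reg_max+1 distribution bins of each box side.
        reg = reg.reshape(n, -1, 4, reg_max + 1).softmax(dim=-1)
        reg = reg.reshape(n, -1, 4 * (reg_max + 1))
        merged.append(torch.cat([cls, reg], dim=-1))
    # One output: [N, total_priors, num_classes + 4*(reg_max+1)].
    return torch.cat(merged, dim=1)
```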

yueyihua commented 3 years ago

@linghu8812 I used the newest pretrained model nanodet-m.ckpt from https://github.com/RangiLyu/nanodet.git, where the nanodet version is 0.3.0, while your project https://github.com/linghu8812/nanodet.git is based on nanodet 0.1.0. If I set up with your code, I get a NotImplementedError. I also tried modifying the code in https://github.com/RangiLyu/nanodet.git, but that didn't work either. What should I do? Do I have to train with your code?
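The NotImplementedError suggests the 0.3.0 checkpoint is being loaded by mismatched head code. A quick way to confirm which nanodet installation Python is actually importing (a trivial check, not specific to either repo):

```python
import nanodet

# Should print a path inside the linghu8812/nanodet checkout if
# `python setup.py develop` took effect; a site-packages path instead
# means the upstream package is shadowing the fork.
print(nanodet.__file__)
```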