yujin6056 / yolactedge-onnx-conversion

Error while conversion #1

Open ToriRori opened 2 years ago

ToriRori commented 2 years ago

Got an error RuntimeError: Tried to trace <__torch__.yolact_edge.yolact.FPN_phase_1 object at 0x5618526e2080> but it is not part of the active trace. Modules that are called during a trace must be registered as submodules of the thing being traced.

frotms commented 2 years ago

same error, too

Baiyuetribe commented 2 years ago

+1, need help. #193

PINTO0309 commented 2 years ago

I have improved the conversion script and committed it to this repository as a reference. Thanks to yujin6056.

anthonygofin commented 1 year ago

Hi @ToriRori. I had the same error and solved it by removing each "@script_method_wrapper" in yolact.py (particularly in the FPN, FPN_phase_1, and FPN_phase_2 classes). I do not know why it works. I am not familiar with scripting/tracing etc., so I do not know whether it has an impact on the resulting ONNX file. Maybe @PINTO0309 has an idea?
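
For readers unfamiliar with the code, the change amounts to roughly the sketch below. The class name, signature, and the two stubbed aliases are simplified stand-ins for the actual yolact.py definitions, not the real code:

```python
import torch.nn as nn

# In yolact.py these two names switch between JIT and plain-Python behaviour;
# they are stubbed here only to keep the sketch self-contained.
ScriptModuleWrapper = nn.Module

def script_method_wrapper(fn, _rcn=None):  # no-op stand-in
    return fn

# Before: forward() is wrapped by the JIT decorator, which is what
# torch.jit.trace complains about during ONNX export.
class FPN_phase_1(ScriptModuleWrapper):
    @script_method_wrapper
    def forward(self, x1=None, x2=None, x3=None):
        ...

# After: with the decorator removed, forward() stays a plain Python method
# and the module traces like any other nn.Module.
# (The redefinition is intentional, for the before/after contrast.)
class FPN_phase_1(ScriptModuleWrapper):
    def forward(self, x1=None, x2=None, x3=None):
        ...
```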

Heiheicxx commented 1 year ago

Your email has been received ^-^ Thank you for writing; let's keep in touch.

PINTO0309 commented 1 year ago

I just don't want to run the JIT compilation step that the @script_method_wrapper decorator applies. The decorator is defined as:

```python
script_method_wrapper = torch.jit.script_method if use_jit else lambda fn, _rcn=None: fn
```
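
In other words, when use_jit is False the decorator is just an identity function, so the FPN modules stay plain nn.Modules and torch.jit.trace can handle them. A minimal, hypothetical demonstration of that pattern (TinyFPN is a stand-in, not the real FPN code):

```python
import torch
import torch.nn as nn

use_jit = False  # assumption: JIT disabled for ONNX export

# Same selection pattern as in yolact.py: with use_jit False, both names
# fall back to plain-Python behaviour.
ScriptModuleWrapper = torch.jit.ScriptModule if use_jit else nn.Module
script_method_wrapper = torch.jit.script_method if use_jit else lambda fn, _rcn=None: fn

class TinyFPN(ScriptModuleWrapper):
    """Stand-in for FPN_phase_1; only here to show that tracing works."""
    def __init__(self):
        super().__init__()
        self.lat = nn.Conv2d(8, 8, kernel_size=1)

    @script_method_wrapper  # no-op when use_jit is False
    def forward(self, x):
        return self.lat(x)

traced = torch.jit.trace(TinyFPN(), torch.randn(1, 8, 16, 16))
print(traced.graph)
```
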
le-wei commented 1 year ago

Thank you so much for the work you do. I need to convert the pth model to a TRT model. Converting the exported ONNX to OpenVINO works, but the conversion to TRT fails. Please guide me. Thank you very much. @PINTO0309
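
For reference, the ONNX-to-TensorRT step that fails here is usually done roughly as sketched below; the file paths are placeholders and the exact builder calls vary between TensorRT versions, so treat this as a hypothetical outline rather than the repository's actual script:

```python
import tensorrt as trt

ONNX_PATH = "yolact_edge.onnx"      # placeholder path
ENGINE_PATH = "yolact_edge.engine"  # placeholder path

logger = trt.Logger(trt.Logger.INFO)
builder = trt.Builder(logger)
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH)
)
parser = trt.OnnxParser(network, logger)

# Parse the ONNX graph; on old TensorRT runtimes this is typically
# where unsupported ops make the conversion fail.
with open(ONNX_PATH, "rb") as f:
    if not parser.parse(f.read()):
        for i in range(parser.num_errors):
            print(parser.get_error(i))
        raise RuntimeError("ONNX parsing failed")

config = builder.create_builder_config()
serialized = builder.build_serialized_network(network, config)  # TensorRT 8.x API
with open(ENGINE_PATH, "wb") as f:
    f.write(serialized)
```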

Heiheicxx commented 1 year ago

Your email has been received ^-^ Thank you for writing; let's keep in touch.

PINTO0309 commented 1 year ago

There are not many possible reasons for failures like this. I have received the exact same issue many times in the public repositories I created, and I can only give the same answer each time.

If the conversion is not possible, it is a problem with the runtime version. I will tell you in advance that the official Jetson runtime is a very old version and probably will not work properly. If you still want it to work, you will have to build or customize the runtime yourself.

Below is an example of a working environment.

  1. TensorRT version https://github.com/PINTO0309/openvino2tensorflow/blob/63fff406a2bd44258c91521add6efc8acf899808/Dockerfile.base#L117-L143

  2. onnxruntime version https://github.com/PINTO0309/openvino2tensorflow/blob/63fff406a2bd44258c91521add6efc8acf899808/Dockerfile.base#L11

  3. onnx-tensorrt version https://github.com/PINTO0309/openvino2tensorflow/blob/63fff406a2bd44258c91521add6efc8acf899808/Dockerfile.base#L153-L164
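
As a quick sanity check against the versions referenced above, something like the following (a hypothetical helper, not part of the repository) prints what the local runtime actually provides:

```python
# Print the locally installed runtime versions so they can be compared
# against the working environment linked above.
import onnxruntime
import tensorrt

print("TensorRT     :", tensorrt.__version__)
print("onnxruntime  :", onnxruntime.__version__)
print("ORT providers:", onnxruntime.get_available_providers())
```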

le-wei commented 1 year ago

Thank you very much for your suggestion; I will try it as you describe. Thanks again. @PINTO0309