YuvalNirkin / hyperseg

HyperSeg - Official PyTorch Implementation
https://nirkin.com/hyperseg
Creative Commons Zero v1.0 Universal
209 stars 39 forks

Saving for C++ inference #2

Closed Roios closed 1 year ago

Roios commented 3 years ago

First, great work!

I was trying to train in Python and save it for C++ inference. The classic approach doesn't work:

annotation_script_module = torch.jit.script(model)
annotation_script_module.save("my_path")

Do you have any suggestion on how to do it?
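When `torch.jit.script` rejects a model, `torch.jit.trace` is a common fallback: it records the operations executed on an example input instead of compiling the Python source, so it often succeeds where scripting fails (at the cost of freezing any input-dependent control flow into the traced path). A minimal sketch, using a hypothetical stand-in module (`TinySeg` is not the real HyperSeg network, which would be loaded from its training checkpoint):

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for the trained HyperSeg model.
class TinySeg(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(3, 2, kernel_size=3, padding=1)

    def forward(self, x):
        return self.conv(x)

model = TinySeg().eval()

# Trace with a representative example input; the traced module is a
# ScriptModule that can be saved and later loaded from the C++ API
# with torch::jit::load("model_traced.pt").
example = torch.randn(1, 3, 64, 64)
traced = torch.jit.trace(model, example)
traced.save("model_traced.pt")

# Sanity check: the reloaded module reproduces the original output.
loaded = torch.jit.load("model_traced.pt")
out = loaded(example)
print(out.shape)
```

Tracing only captures the path taken for the example input, so it is worth verifying the traced module on inputs of the shapes you actually intend to serve.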

YuvalNirkin commented 3 years ago

I haven't tried running the models with the C++ API, but this could be interesting; it might improve the runtime performance. Can you elaborate on what doesn't work? Are you getting any errors?

Roios commented 3 years ago

My idea was simply to convert the model using PyTorch's scripting, as I do for other models, then load it in my C++ program and run inference. From what I was able to check, the network architecture, as it stands, is not scriptable. I haven't dug deep enough to pinpoint the root of the problem. I don't believe it would be much faster, but it would open more doors for using the model in other programs.
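One way to narrow down which part of an architecture is not scriptable is to try scripting each direct child module in isolation and collect the failures. A hedged sketch, using a hypothetical two-block model (the `Bad` block deliberately mixes return types, one of the things TorchScript rejects):

```python
import torch
import torch.nn as nn

# Hypothetical block that TorchScript rejects: its forward can return
# either a Tensor or a str, and TorchScript requires a single return type.
class Bad(nn.Module):
    def forward(self, x):
        if x.sum() > 0:
            return x
        return "negative"

# Hypothetical model with one scriptable and one unscriptable child.
class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.good = nn.Linear(4, 4)
        self.bad = Bad()

def find_unscriptable(model: nn.Module) -> dict:
    """Try to script each direct child; map failing names to the error."""
    failures = {}
    for name, child in model.named_children():
        try:
            torch.jit.script(child)
        except Exception as err:
            failures[name] = str(err).splitlines()[0]
    return failures

print(sorted(find_unscriptable(Net())))
```

For deeply nested models the same loop can be applied recursively via `named_modules()` to home in on the exact offending layer.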

YuvalNirkin commented 3 years ago

You can try exporting to ONNX instead.

YuvalNirkin commented 1 year ago

I am closing this issue; you can reopen it if it's still relevant.