aleSuglia opened 1 year ago
For me, `opset_version=11` together with `torch.no_grad()` worked!

```python
with torch.no_grad():
    torch.onnx.export(
        model,
        img_input,
        onnx_model_path,            # where the model should be saved
        verbose=False,
        export_params=True,
        do_constant_folding=False,  # set True to fold constant values for optimization
        input_names=['input'],
        output_names=['output'],
        opset_version=11,
    )
```
but I cannot reload the ONNX model... :(
Hello there,
Thank you so much for this great repository. I've been using VinVL for a while now and I'm really pleased with its accuracy. However, considering its size, I was wondering whether you have any plans to support ONNX to speed up inference. I tried to enable it myself, but I got some very strange errors on certain operations.
Do you have any advice?