Open kkeono2 opened 11 months ago
you can look at this discussion -> https://github.com/facebookresearch/Detic/issues/113#issue-1997887378
Thank you for the rapid answer, @codet-ovd.
I used detectron2's built-in script (I just changed the set-up configuration in export_onnx.py) to convert the torch model to ONNX, and the conversion succeeded.
However, I ran into a problem with TopK initialization when converting the ONNX model to TensorRT, shown below (TensorRT error message).
```
[6] Invalid Node - /proposal_generator/TopK
This version of TensorRT only supports input K as an initializer
```
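This TensorRT restriction means the second input (K) of every TopK node must be a constant initializer baked into the graph, not a value produced by another node. A small sketch to locate the offending nodes in an exported model (the function name and the model path in the usage comment are mine, not from the repo):

```python
def find_dynamic_topk(graph):
    """Return names of TopK nodes whose K input is not a graph initializer.

    TensorRT rejects TopK nodes whose K is produced by another node
    (i.e., data-dependent), so these are the nodes that need fixing.
    """
    init_names = {init.name for init in graph.initializer}
    dynamic = []
    for node in graph.node:
        if node.op_type == "TopK" and len(node.input) > 1:
            # input[1] of TopK is K; it must appear among the initializers
            if node.input[1] not in init_names:
                dynamic.append(node.name)
    return dynamic

# Usage (hypothetical path):
#   import onnx
#   model = onnx.load("detic.onnx")
#   print(find_dynamic_topk(model.graph))
```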
I guess the TopK initialization problem comes from image_thresh varying per image in centernet.py, so I fixed image_thresh to a constant value (0.1):

```python
if num_dets > post_nms_topk:
    cls_scores = result.scores
    # image_thresh, _ = torch.kthvalue(cls_scores.float().cpu(), num_dets - post_nms_topk + 1)
    keep = cls_scores >= 0.1  # image_thresh.item()
```
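As an alternative to hard-coding the threshold, replacing the data-dependent torch.kthvalue with torch.topk and a fixed k should export a TopK node whose K is a constant, which TensorRT accepts. A minimal sketch under that assumption (keep_top_scores is a hypothetical helper, not a function in centernet.py):

```python
import torch

def keep_top_scores(cls_scores: torch.Tensor, post_nms_topk: int) -> torch.Tensor:
    """Indices of the top post_nms_topk scores, using a constant K.

    torch.topk with a fixed k traces to an ONNX TopK whose K input is an
    initializer, unlike a threshold derived from torch.kthvalue. Assumes
    static input shapes at export time so min() folds to a constant.
    """
    k = min(post_nms_topk, cls_scores.numel())
    _, keep = torch.topk(cls_scores, k)
    return keep
```

The returned indices could then replace the boolean `keep` mask in the post-NMS filtering above.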
I also faced the error below; I don't know why it occurs.

```
[6] Invalid Node - /roi_heads/box_pooler/level_poolers.0/If
/roi_heads/box_pooler/level_poolers.0/If_OutputLayer: IIfConditionalOutputLayer inputs must have the same shape.
```
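This error means a conditional in the box pooler has then/else branches whose outputs differ in shape, which TensorRT rejects. To see which If nodes survive in the exported graph (and decide whether constant-folding, e.g. with onnx-simplifier, might remove them), a small sketch (the function name and model path are mine):

```python
def find_if_nodes(graph):
    """Return (name, outputs) for every If node in an ONNX graph.

    TensorRT requires both branches of an If to yield identically
    shaped outputs; listing the nodes helps track down the offender.
    """
    return [
        (node.name, list(node.output))
        for node in graph.node
        if node.op_type == "If"
    ]

# Usage (hypothetical path):
#   import onnx
#   model = onnx.load("detic.onnx")
#   for name, outputs in find_if_nodes(model.graph):
#       print(name, outputs)
```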
Can you help me solve this, please? I am using TensorRT 8.5.1 and ONNX 1.12.0.
Thank you for answering, @gigasurgeon.
I can convert the torch model to ONNX without commenting out the nms_and_topk line in centernet while exporting the model.
However, I can't convert the ONNX model to TensorRT, as described above.
Can you give me a suggestion for solving the above issue?
Thank you for your great work.
Do you have any plan to support a model-conversion implementation for ONNX?