aim-uofa / AdelaiDet

AdelaiDet is an open source toolbox for multiple instance-level detection and recognition tasks.
https://git.io/AdelaiDet

ABCNet deployment #168

Open shuangyichen opened 4 years ago

shuangyichen commented 4 years ago

Is there any way to export the model to a Caffe model, e.g. using detectron2's caffe2_converter.py tool?

shuangyichen commented 4 years ago

@blueardour Hi,

Can onnx/export_model_to_onnx.py work for ABCNet? Looking forward to your reply. Thank you!

blueardour commented 4 years ago

Thanks for reporting the requirement. I will put some effort into the conversion.

shuangyichen commented 4 years ago

@blueardour Hi, I am trying to do the conversion myself because my assignment is very urgent, and I am following your work. Could you please give me some hints or explain your pipeline for exporting to ONNX? Thank you!

blueardour commented 4 years ago

Hi, @shuangyichen

For the ONNX conversion script, one might consider the following aspects:

Example:

```
OneStageDetector(
  (backbone): FPN(
    (fpn_lateral3): Conv2d(
      512, 256, kernel_size=(1, 1), stride=(1, 1), bias=False
      (norm): NaiveSyncBatchNorm(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (quant_activation): quantization()
      (quant_weight): quantization()
    )
    (fpn_output3): Conv2d(
      256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
      (norm): NaiveSyncBatchNorm(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (quant_activation): quantization()
      (quant_weight): quantization()
    )
    (fpn_lateral4): Conv2d(
      1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False
      (norm): NaiveSyncBatchNorm(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (quant_activation): quantization()
      (quant_weight): quantization()
    )
    (fpn_output4): Conv2d(
      256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
      (norm): NaiveSyncBatchNorm(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
```
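One aspect visible in the printout above is the NaiveSyncBatchNorm layers, which may not map cleanly to a standard ONNX BatchNormalization node. A minimal sketch of swapping them for plain BatchNorm2d before export; the NaiveSyncBatchNorm class below is a local stand-in so the example is self-contained, not the detectron2 implementation:

```python
import torch
import torch.nn as nn

# Local stand-in for detectron2's NaiveSyncBatchNorm, used only so this
# sketch runs on its own; in AdelaiDet the real class comes from detectron2.
class NaiveSyncBatchNorm(nn.BatchNorm2d):
    pass

def replace_naive_sync_bn(module: nn.Module) -> nn.Module:
    """Recursively swap NaiveSyncBatchNorm for plain BatchNorm2d,
    copying the affine parameters and running statistics."""
    for name, child in module.named_children():
        if type(child).__name__ == "NaiveSyncBatchNorm":
            bn = nn.BatchNorm2d(
                child.num_features, eps=child.eps, momentum=child.momentum,
                affine=child.affine,
                track_running_stats=child.track_running_stats)
            bn.load_state_dict(child.state_dict())  # weights + running stats
            setattr(module, name, bn)
        else:
            replace_naive_sync_bn(child)
    return module

model = nn.Sequential(
    nn.Conv2d(3, 8, 3, padding=1), NaiveSyncBatchNorm(8), nn.ReLU())
model = replace_naive_sync_bn(model)
```

After this pass, `print(model)` shows plain BatchNorm2d layers, which the ONNX exporter handles without custom operators.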
blueardour commented 4 years ago

Just updating some progress.

shuangyichen commented 3 years ago

Thank you for your contribution!

shuangyichen commented 3 years ago

Hello @blueardour , Do you know how to deal with BezierAlign before torch.onnx.export? Looking forward to your reply.

blueardour commented 3 years ago

hi, @shuangyichen

From my perspective, I suggest not exporting BezierAlign to ONNX. Instead, export only the parts built from standard operators such as conv/fc/bn/relu/interp. When running inference on an embedded platform or TensorRT, one can re-implement BezierAlign on the dedicated platform.

shuangyichen commented 3 years ago

@blueardour So we use BezierAlign as the boundary that divides the model into two parts, simply skip BezierAlign itself, and convert the two parts to ONNX separately?

blueardour commented 3 years ago

Yes. Though it involves some extra effort, it provides a feasible path.
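The two-part pipeline agreed above can be sketched as follows, with dummy modules standing in for the two exported ONNX graphs and a crude pooling stub where the real BezierAlign kernel would run on the target platform; all sizes here (including the 97-way output) are illustrative assumptions:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Stand-ins for the two exported graphs: at deployment these would be two
# ONNX sessions, with BezierAlign running as custom host/plugin code between.
part1 = nn.Conv2d(3, 8, 3, padding=1)   # stand-in for the exported trunk
part2 = nn.Linear(8 * 4 * 16, 97)       # stand-in for the recognition branch

def bezier_align_stub(features, rois, out_h=4, out_w=16):
    """Placeholder for BezierAlign: the real kernel samples the feature map
    along Bezier control points; this just pools a fixed region so the
    data flow between the two ONNX parts is visible."""
    crops = [F.adaptive_avg_pool2d(features[i:i + 1], (out_h, out_w))
             for i in range(features.size(0)) for _ in rois]
    return torch.cat(crops, dim=0)

image = torch.randn(1, 3, 32, 64)
features = part1(image)                  # would be ONNX session 1
crops = bezier_align_stub(features, rois=[0, 1])
logits = part2(crops.flatten(1))         # would be ONNX session 2
```

The boundary tensors (`features` in, `crops` out) define the input/output signatures the two ONNX files must agree on.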

jdavidd commented 3 years ago

@shuangyichen Hello, did you manage to write the script to convert the ABCNet model to ONNX?

jdavidd commented 3 years ago

@blueardour Hello, did you work on the script to convert the ABCNet model to ONNX?

evanchien commented 2 years ago

@blueardour Were you able to get the conversion working? Would really love to see how well it works after deployment. Thanks! :)

ucsky commented 2 years ago

Hi @blueardour, I'm in the same situation. Do you have any feedback on converting the model to ONNX?

Single430 commented 2 years ago

@shuangyichen @blueardour Could you tell me the current progress of the ABCNet-to-ONNX conversion?

pustar commented 1 year ago

@shuangyichen @blueardour Could you tell me the current progress of the ABCNet-to-ONNX conversion?