onnx / optimizer

Actively maintained ONNX Optimizer
Apache License 2.0

If model uses initializers, fuse_bn_into_conv is broken from v0.3.9 #133

Open sa-kei728 opened 1 year ago

sa-kei728 commented 1 year ago

I tried to optimize a model with "fuse_bn_into_conv", and from v0.3.9 the optimized model is generated incorrectly. I would say 807cff7 is the cause: the commit works fine when the BN parameters come from Constant nodes, but it no longer handles parameters stored as initializers. Why were these changes made? I'd like initializers to be supported as well, since onnxoptimizer provides "extract_constant_to_initializer", which I often use to avoid errors related to topological sorting. At the very least, I'd rather not have this behavior change.
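To see which case a given model falls into, here is a minimal sketch (using the model from the repro below; the loop is my own illustration, not onnxoptimizer code) that reports whether each BatchNormalization parameter is fed by a graph initializer or by a Constant node:

    import onnx

    # Illustration only: classify where each BatchNormalization node
    # gets its scale/bias/mean/var inputs from.
    model = onnx.load("tiny-yolov3-11.onnx")
    graph = model.graph

    initializer_names = {init.name for init in graph.initializer}
    constant_outputs = {out for node in graph.node
                        if node.op_type == "Constant" for out in node.output}

    for node in graph.node:
        if node.op_type != "BatchNormalization":
            continue
        for name in node.input[1:]:  # skip X; scale, B, mean, var remain
            if name in initializer_names:
                source = "initializer"
            elif name in constant_outputs:
                source = "Constant node"
            else:
                source = "computed at runtime"
            print(node.name or node.output[0], name, "->", source)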

### Confirmed Environment (part of pyproject.toml)

    [tool.poetry.dependencies]
    python = "3.11"
    onnx = "1.13.1"
    onnxoptimizer = "0.3.9"

### Reproduction Code

  1. Download tiny-yolov3.onnx (this model uses initializers): https://github.com/onnx/models/blob/main/vision/object_detection_segmentation/tiny-yolov3/model/tiny-yolov3-11.onnx

  2. Install onnx and onnxoptimizer (versions as above)

  3. Create and run the reproduction code below (a quick numerical check follows after the code):

    
    import onnx
    from onnxoptimizer import optimize

    # load model
    model_path = "tiny-yolov3-11.onnx"
    model = onnx.load(model_path)

    # setting optimize passes
    optimization_passes = ["fuse_bn_into_conv"]

    # execute optimization
    optimized_model = optimize(model, optimization_passes)

    # save model
    optimized_model_path = "tiny-yolov3-11-optimized.onnx"
    onnx.save(optimized_model, optimized_model_path)
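To make the breakage concrete, a rough check (assuming onnxruntime is installed; the feed is random data built generically from the session inputs, which is not meaningful for a detector but is usually enough to expose a wrong fusion) is to run both models on the same input and compare the raw outputs:

    import numpy as np
    import onnxruntime as ort

    # Build a random feed from the model's declared inputs; symbolic dims -> 1.
    def make_feed(session, rng):
        feed = {}
        for inp in session.get_inputs():
            shape = [d if isinstance(d, int) else 1 for d in inp.shape]
            dtype = np.float32 if "float" in inp.type else np.int64
            feed[inp.name] = rng.random(shape).astype(dtype)
        return feed

    rng = np.random.default_rng(0)
    sess_ref = ort.InferenceSession("tiny-yolov3-11.onnx",
                                    providers=["CPUExecutionProvider"])
    sess_opt = ort.InferenceSession("tiny-yolov3-11-optimized.onnx",
                                    providers=["CPUExecutionProvider"])

    feed = make_feed(sess_ref, rng)
    ref_outs = sess_ref.run(None, feed)
    opt_outs = sess_opt.run(None, feed)

    # Compare each output of the original vs. optimized model.
    for out, a, b in zip(sess_ref.get_outputs(), ref_outs, opt_outs):
        if a.shape != b.shape:
            print(out.name, "shape mismatch:", a.shape, "vs", b.shape)
        else:
            print(out.name, "max abs diff:", np.abs(a - b).max())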



### v0.3.9
![v0_3_9](https://user-images.githubusercontent.com/46880264/232750016-58b0d473-7a85-4e08-a43b-b3c18971019b.png)

### In case of v0.3.8
![v0_3_8](https://user-images.githubusercontent.com/46880264/232749955-1c2f8362-1e37-4961-8ed3-7cc317e89249.png)
daquexian commented 1 year ago

Thanks for your issue! We'll consider how to fix it. Could you please try using onnxsim for now?
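For reference, the onnxsim Python API usage would look roughly like this (a sketch; the output file name is my own choice):

    import onnx
    from onnxsim import simplify

    # onnx-simplifier constant-folds and fuses during simplification, which in
    # this model also folds the BatchNormalization into the preceding Conv.
    model = onnx.load("tiny-yolov3-11.onnx")
    model_simplified, check_ok = simplify(model)
    assert check_ok, "onnxsim could not validate the simplified model"
    onnx.save(model_simplified, "tiny-yolov3-11-simplified.onnx")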

sa-kei728 commented 1 year ago

Thank you for the quick response! I tried onnxsim on both the original and the optimized model, and the correctly fused model is now created!

sa-kei728 commented 1 year ago

I found another case. If the Conv has a "bias" term, it seems that fuse_bn_into_conv fails from v0.3.9. This case is not resolved by using onnxsim.

Such models are uncommon in general, but the tool I was using generates an ONNX model that temporarily includes a bias term in the Conv and then runs fuse_bn_into_conv, which fails in the optimizer from v0.3.9 onward. (I'd like to share this model and tool with you, but I cannot do so for certain reasons. Sorry...)
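For reference, folding a BatchNormalization (scale gamma, bias beta, running mean, variance var, epsilon eps) into a preceding Conv that already has a bias b should amount to scaling each output channel of W by gamma/sqrt(var + eps) and computing the new bias as that factor times (b - mean), plus beta. A small numpy sketch of that expectation (the helper name is mine, not onnxoptimizer's):

    import numpy as np

    def fused_conv_params(W, b, gamma, beta, mean, var, eps=1e-5):
        """Expected result of folding BatchNormalization into a Conv with bias.

        W: (C_out, C_in, kH, kW) weights; b, gamma, beta, mean, var: (C_out,).
        """
        scale = gamma / np.sqrt(var + eps)          # per-output-channel factor
        W_fused = W * scale[:, None, None, None]    # scale each output channel
        b_fused = scale * (b - mean) + beta         # existing bias folds in here
        return W_fused, b_fused

If a pass ignores the existing Conv bias b, then b_fused comes out wrong, which would match the symptom described here.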

There is a workaround for this case, so it's not a problem for me for now. I'm commenting to share the information.

AndreyOrb commented 10 months ago

@sa-kei728 What is the workaround you used?

sa-kei728 commented 10 months ago

@AndreyOrb In my case, I used v0.3.8.

I would say onnx2json could also resolve this issue: you could remove the bias term from the target Conv node and append an Add node after the Conv. (However, I haven't tried it, so I can't guarantee it works.)
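In case it helps anyone, here is a rough sketch of that workaround using the onnx Python API instead of onnx2json (untested, same caveat as above; the helper name is mine, and it assumes the bias is stored as a graph initializer):

    import onnx
    from onnx import helper, numpy_helper

    # Strip the "B" input from a Conv node and re-apply it with an explicit Add
    # right after it, so fuse_bn_into_conv only ever sees a bias-free Conv.
    def split_conv_bias(model, conv_name):
        graph = model.graph
        conv_idx, conv = next((i, n) for i, n in enumerate(graph.node)
                              if n.op_type == "Conv" and n.name == conv_name)
        if len(conv.input) < 3:
            return model  # no bias to split out

        bias_name = conv.input[2]
        conv_out = conv.output[0]
        no_bias_out = conv_out + "_nobias"

        # Conv now writes to an intermediate tensor and loses its bias input.
        del conv.input[2]
        conv.output[0] = no_bias_out

        # Bias is (C_out,); reshape to (1, C_out, 1, 1) so Add broadcasts over NCHW.
        bias_init = next(t for t in graph.initializer if t.name == bias_name)
        bias = numpy_helper.to_array(bias_init).reshape(1, -1, 1, 1)
        graph.initializer.append(numpy_helper.from_array(bias, bias_name + "_reshaped"))

        add = helper.make_node("Add", [no_bias_out, bias_name + "_reshaped"],
                               [conv_out], name=conv_name + "_bias_add")
        graph.node.insert(conv_idx + 1, add)  # keep the node list topologically sorted
        return model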