daquexian / onnx-simplifier

Simplify your onnx model
Apache License 2.0

fix: Preserve empty initializer inputs #305

Closed — Xyzhao1999 closed this 10 months ago

Xyzhao1999 commented 10 months ago

For some operators, an input may be an empty initializer. During the constant-folding phase, such an input is treated as a normal input node, which can break the constraints of some operators, such as Resize.

Original ONNX

The Resize node allows only one of `scales` and `sizes` to be set. The old version may generate a model in which both are present, and onnxruntime refuses to load such a model.
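For context, ONNX encodes an omitted optional input as an empty string `""` in its positional slot of `NodeProto.input`. The following is a minimal pure-Python sketch (not onnxsim's actual code) that models the input list as a plain list of strings and checks the Resize constraint; the names `uses_scales`/`uses_sizes` are illustrative only:

```python
# Sketch: a Resize node's inputs are (X, roi, scales, sizes).
# An omitted optional input occupies its slot as the empty string "".
resize_with_sizes = {
    "op_type": "Resize",
    # scales (slot 2) is intentionally left empty; only sizes is set
    "input": ["X", "roi", "", "sizes_init"],
}

def uses_scales(node):
    # scales is present only if slot 2 exists and is non-empty
    return len(node["input"]) > 2 and node["input"][2] != ""

def uses_sizes(node):
    # sizes is present only if slot 3 exists and is non-empty
    return len(node["input"]) > 3 and node["input"][3] != ""

# Resize requires exactly one of scales/sizes to be provided.
def resize_is_valid(node):
    return uses_scales(node) != uses_sizes(node)

print(resize_is_valid(resize_with_sizes))  # True: only sizes is set
```

If a simplifier pass replaces the empty `scales` slot with a real tensor, `resize_is_valid` becomes false, which is exactly the shape-inference error onnxruntime reports below.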

onnxsim-old

In [3]: sess = onnxruntime.InferenceSession('onnxsim_bug_fix/resize_561_old.onnx', providers=['CPUExecutionProvider'])
---------------------------------------------------------------------------
Fail                                      Traceback (most recent call last)
Cell In[3], line 1
----> 1 onnxruntime.InferenceSession('onnxsim_bug_fix/resize_561_old.onnx', providers=['CPUExecutionProvider'])

File ~/Environments/miniconda3/envs/default/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:383, in InferenceSession.__init__(self, path_or_bytes, sess_options, providers, provider_options, **kwargs)
    380 disabled_optimizers = kwargs["disabled_optimizers"] if "disabled_optimizers" in kwargs else None
    382 try:
--> 383     self._create_inference_session(providers, provider_options, disabled_optimizers)
    384 except (ValueError, RuntimeError) as e:
    385     if self._enable_fallback:

File ~/Environments/miniconda3/envs/default/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:424, in InferenceSession._create_inference_session(self, providers, provider_options, disabled_optimizers)
    422 session_options = self._sess_options if self._sess_options else C.get_default_session_options()
    423 if self._model_path:
--> 424     sess = C.InferenceSession(session_options, self._model_path, True, self._read_config_from_model)
    425 else:
    426     sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)

Fail: [ONNXRuntimeError] : 1 : FAIL : Load model from onnxsim_bug_fix/resize_561_old.onnx failed:Node (Resize_561) Op (Resize) [ShapeInferenceError] Either `sizes` or `scales` must be provided, but not both of them

In [4]: 

Special-casing empty initializer inputs solves this problem.

onnxsim-new

In [7]: ses = onnxruntime.InferenceSession('onnxsim_bug_fix/resize_561_fix.onnx', providers=['CPUExecutionProvider'])

In [8]: