ThanatosShinji / onnx-tool

A parser, editor and profiler tool for ONNX models.
https://pypi.org/project/onnx-tool/
MIT License

Dynamic shape support #40

Closed ZhangGe6 closed 1 year ago

ZhangGe6 commented 1 year ago

Thanks for the nice work 👍. It seems that working on an ONNX model with a dynamic shape is not currently supported. For example, if I download resnet18-v1-7.onnx and run

import onnx_tool
modelpath = 'resnet18-v1-7.onnx'
onnx_tool.model_profile(modelpath) # pass file name

Then I get the following runtime error:

Traceback (most recent call last):
  File "dyn_test.py", line 3, in <module>
    onnx_tool.model_profile(modelpath) # pass file name
  File "Python\lib\site-packages\onnx_tool\__init__.py", line 90, in model_profile
    g.shape_infer(dynamic_shapes)
  File "Python\lib\site-packages\onnx_tool\graph.py", line 735, in shape_infer
    raise ValueError(
ValueError: The input tensor data's shape [N,3,224,224] is not valid, Please set it to a valid shape.

I looked into the onnx-tool source code and found

def check_inputs(self):
  for name in self.input:
    shape = self.tensormap[name].shape
    for val in shape:
      if isinstance(val, str):
        return False, name
      if val < 0:
        return False, name
  return True, None

As ONNX uses a string or -1 to denote a dynamic shape dimension, it seems that dynamic shapes are not supported in onnx-tool. Or am I missing something?
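For illustration, the rejection logic in check_inputs can be reproduced in isolation. This is a minimal sketch, not onnx-tool code; has_dynamic_dim is a hypothetical helper with the same test the validator applies per dimension:

```python
def has_dynamic_dim(shape):
    # ONNX stores a dynamic dim either as a string name (dim_param, e.g. 'N')
    # or as a negative placeholder value; either one fails the check
    return any(isinstance(d, str) or d < 0 for d in shape)

# resnet18-v1-7.onnx declares its input as [N, 3, 224, 224], so the check fails:
print(has_dynamic_dim(['N', 3, 224, 224]))  # True
print(has_dynamic_dim([-1, 3, 224, 224]))   # True
print(has_dynamic_dim([1, 3, 224, 224]))    # False
```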

ThanatosShinji commented 1 year ago

Hi, each profiling report is based on a specific input shape; the results vary across input shapes. In your case, you should pass the dynamic_shapes parameter like: `onnx_tool.model_profile(modelpath, dynamic_shapes={'data': numpy.zeros((1,3,224,224))})`

ZhangGe6 commented 1 year ago

@ThanatosShinji Thanks for clarifying. Passing the parameter `dynamic_shapes={'data': numpy.zeros((1,3,224,224))}` works for model profiling. I also tried shape inference with onnx-tool, using this code:

import onnx_tool
import numpy as np

modelpath = 'resnet18-v1-7.onnx'
onnx_tool.model_shape_infer(
    modelpath,
    dynamic_shapes={'data': np.zeros((1,3,224,224))},
    saveshapesmodel="res_model.onnx",
    verbose=True
)

It works well, and I get res_model.onnx with a fixed batch size of 1 (rather than the dynamic N in the original model). However, is there a way to keep the N dimension in the resulting model, as the native ONNX API does? That is to say, if I run the following code:

import onnx

model_path = 'resnet18-v1-7.onnx'
model_proto = onnx.load(model_path)
infered_shape_info = onnx.shape_inference.infer_shapes(model_proto)
for value_info in infered_shape_info.graph.value_info:
    model_proto.graph.value_info.append(value_info)

onnx.save(model_proto, "res_model_onnx_api.onnx")

I can get a "dynamic shape" model with the N dimension kept.

(screenshot: shape-inference result with the N dimension kept)

ThanatosShinji commented 1 year ago

Your case is a simple one, with only the batch dimension dynamic, and batch is a constant value across all tensor shapes. For a complete solution, you would need to consider dynamic shapes in the other dims like c, h, and w, whose values differ from tensor to tensor. E.g. tensor1: (batch, c/3, h/2, w/2), or even more complicated, (batch, c/3+16, h/2+1, w/2+1). What do you do with these shapes?
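The maintainer's point can be sketched in a few lines: shape inference has to do arithmetic on dimension values, which works for concrete ints but not for symbolic names. This is an illustrative sketch using the standard convolution output-size formula; conv_out_dim is a hypothetical helper, not part of onnx-tool:

```python
def conv_out_dim(size, kernel=3, stride=2, pad=1):
    # standard conv output-size formula: floor((size - kernel + 2*pad) / stride) + 1
    return (size - kernel + 2 * pad) // stride + 1

print(conv_out_dim(224))  # a concrete h=224 propagates to 112

try:
    conv_out_dim('h')     # a symbolic dim cannot go through plain int arithmetic
except TypeError:
    print('symbolic dims would need a symbolic-math layer, not plain ints')
```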

BTW, with the Python class API, onnx_tool.Graph.tensormap, you can set a tensor's shape to a list containing strings, like ['N', 3, 224, 224], to meet your need.

ThanatosShinji commented 1 year ago

Please refer to the sample code of profile; you can use code like this to update the batch dim of each tensor's shape to a string value:

for key in m.graph.tensormap.keys():
    tensor = m.graph.tensormap[key]
    shape = tensor.get_shape()
    shape[0] = 'N'  # replace the batch dim with a symbolic name
    tensor.update_shape(shape)
m.save_model('resnet50_batch.onnx')

ZhangGe6 commented 1 year ago

Got it. Thanks.