Samsung / ONE

On-device Neural Engine

[one-cmds] Allow to specialize shape of input/output tensors during ONNX-Circle conversion #13636

Open mbencer opened 2 months ago

mbencer commented 2 months ago

In the current state there is no way to specialize the shapes of input/output tensors during ONNX->Circle conversion. The ONNX library provides update_inputs_outputs_dims (in onnx.tools.update_model_dims), which can be used to implement this feature.

Moreover, infer_shapes from the ONNX library can be used for shape inference (in many cases it eliminates the need to provide output shapes explicitly).

mbencer commented 2 months ago

TEST MODELS

# Generates small ONNX test models exercising dynamic input/output shapes.
import os
import onnx
from onnx import helper
from onnx import TensorProto

def create_test_abs_model(input_shape, output_shape, output_path):
    input = helper.make_tensor_value_info('input', TensorProto.FLOAT, input_shape)
    output = helper.make_tensor_value_info('output', TensorProto.FLOAT, output_shape)

    node_def = helper.make_node(
        'Abs',
        inputs=['input'],
        outputs=['output'],
    )

    graph_def = helper.make_graph(
        [node_def],
        'test_model',
        [input],
        [output],
    )

    onnx_model = helper.make_model(graph_def, producer_name='test_model')
    onnx.checker.check_model(onnx_model)
    onnx.save(onnx_model, output_path)

def create_test_reshape(input_shape, output_path):
    input = helper.make_tensor_value_info('input', TensorProto.FLOAT, input_shape)
    # Reshape's 'shape' input must be int64 per the ONNX operator spec
    shape = helper.make_tensor_value_info('shape', TensorProto.INT64, [3])
    output = helper.make_tensor_value_info('output', TensorProto.FLOAT, ['x', 'y', 'z'])

    reshape_node = onnx.helper.make_node(
        'Reshape',
        inputs=['input', 'shape'],
        outputs=['output']
    )

    graph_def = helper.make_graph(
        [reshape_node],
        'test_model',
        [input, shape],
        [output],
    )

    onnx_model = helper.make_model(graph_def, producer_name='test_model', opset_imports=[onnx.helper.make_opsetid("", 10)])
    onnx.checker.check_model(onnx_model)
    onnx.save(onnx_model, output_path)

def create_test_abs_reshape(input_shape, output_path):
    input = helper.make_tensor_value_info('input', TensorProto.FLOAT, input_shape)
    # Reshape's 'shape' input must be int64 per the ONNX operator spec
    shape = helper.make_tensor_value_info('shape', TensorProto.INT64, [3])
    reshape_out = helper.make_tensor_value_info('reshape_out', TensorProto.FLOAT, ['x', 'y', 'z'])

    abs_node = helper.make_node(
        'Abs',
        inputs=['input'],
        outputs=['abs_out'],
    )

    reshape_node = onnx.helper.make_node(
        'Reshape',
        inputs=['abs_out', 'shape'],
        outputs=['reshape_out']
    )

    graph_def = helper.make_graph(
        [abs_node, reshape_node],
        'test_model',
        [input, shape],
        [reshape_out],
    )

    onnx_model = helper.make_model(graph_def, producer_name='test_model', opset_imports=[onnx.helper.make_opsetid("", 10)])
    onnx.checker.check_model(onnx_model)
    onnx.save(onnx_model, output_path)

# NOTE: despite the name, this builds an Abs + Concat model.
def create_elu_concat_test_model(input_shapes, output_shapes, output_path):
    abs_in = helper.make_tensor_value_info('abs_in', TensorProto.FLOAT, input_shapes[0])
    con_1_in = helper.make_tensor_value_info('con_1_in', TensorProto.FLOAT, input_shapes[1])
    con_2_in = helper.make_tensor_value_info('con_2_in', TensorProto.FLOAT, input_shapes[2])

    abs_out = helper.make_tensor_value_info('abs_out', TensorProto.FLOAT, output_shapes[0])
    con_out = helper.make_tensor_value_info('con_out', TensorProto.FLOAT, output_shapes[1])

    abs_def = helper.make_node(
        'Abs',
        inputs=['abs_in'],
        outputs=['abs_out'],
    )

    concat_def = onnx.helper.make_node(
        'Concat',
        inputs=['con_1_in', 'abs_out', 'con_2_in'],
        outputs=['con_out'],
        axis=1
    )

    graph_def = helper.make_graph(
        [abs_def, concat_def],
        'test_model',
        [abs_in, con_1_in, con_2_in],
        [abs_out, con_out],
    )

    onnx_model = helper.make_model(graph_def, producer_name='test_model', opset_imports=[onnx.helper.make_opsetid("", 10)])
    onnx.checker.check_model(onnx_model)

    onnx.save(onnx_model, output_path)

out_dir = 'onnx_dyn_shapes_models'

if not os.path.exists(out_dir):
    os.makedirs(out_dir)

create_test_abs_model(['batch', 5, 5], ['batch', 5, 5], f'{out_dir}/abs_dynamic_batch.onnx')
create_test_abs_model(['batch', 5, 'x'], ['batch', 5, 'x'], f'{out_dir}/abs_dynamic_input.onnx')
create_test_abs_model([None], [None], f'{out_dir}/shape_not_provided_abs.onnx')
create_test_reshape([120], f'{out_dir}/reshape_static_inputs_dynamic_output.onnx')
create_test_abs_reshape([2, 'x', 6], f'{out_dir}/abs_reshape_dynamic_inputs_and_output.onnx')
create_elu_concat_test_model([['batch', 2, 'x'],['batch', 1, 'x'],['batch', 1, 'x']], [['batch', 2, 5], ['batch', 4, 5]], f'{out_dir}/abs_concat_dynamic_inputs_and_outputs.onnx')

onnx_dyn_shapes_models.zip

seanshpark commented 1 month ago

@mbencer , the main issue is that the model developer needs to export the ONNX model with fixed shapes, not with dynamic shapes.

mbencer commented 1 month ago

@seanshpark That's true. In general that is the final purpose of https://github.com/Samsung/ONE/pull/13638. I changed the title/description to make it clearer ;-)

seanshpark commented 1 month ago

I'll be more specific. I do not want this feature in one-import-onnx as of now.

mbencer commented 1 month ago

@seanshpark Sorry, I misunderstood the previous message - static shapes should be set before the ONNX-Circle conversion performed by one-import-onnx. What do you think about providing a separate tool to specialize shapes in https://github.com/Samsung/ONE/tree/master/compiler/onnx-tools ?

seanshpark commented 1 month ago

These are the things that make me hesitate to accept the feature from you: 1/ I don't know how you can handle UI tool problems and maintenance with customer service. 2/ I've not had a chance to work with you, so I cannot just agree to what you want to do.

mbencer commented 1 month ago

1/ I don't know how you can handle UI tool problems and maintenance with customer service.

In the current version, if a user provides the additional arguments --input_shapes and/or --output_shapes, the shapes are specialized (and, moreover, shape inference provided by the onnx library is called) before the conversion to TensorFlow. Otherwise, there is no change to importing/conversion.

If we choose the "separate tool" approach (IMHO the less user-friendly choice), I see it as just a Python script (or a package with a setup.py - to be considered) with --input_path <PATH_TO_ONNX_MODEL> and --output_path <PATH_TO_MODEL_WITH_FIXED_SHAPES> arguments, plus --input_shapes and --output_shapes command-line arguments.

mbencer commented 1 month ago

@seanshpark Are you ok with turning the https://github.com/Samsung/ONE/pull/13638 feature into regular PRs and starting to introduce it?

I would also like to clarify one point:

1/ I don't know how you can handle UI tool problems and maintenance with customer service.

Could you elaborate a little more on what you mean here? (I am not sure now whether my previous response makes sense) ;-)

mbencer commented 3 weeks ago

@seanshpark Could you please take a look^^

mbencer commented 2 weeks ago

@seanshpark Could you please take a look^ ;-)