TexasInstruments / edgeai-tidl-tools

Edgeai TIDL Tools and Examples - This repository contains tools and examples developed for the Deep Learning Runtime (DLRT) offering provided by TI's edge AI solutions.

Batch processing with this tool? #32

Open ywpkwon opened 1 year ago

ywpkwon commented 1 year ago

I'm using TIDL 8.1 and trying a simple classification model (input: [N, 3, 64, 64], output: [N, 26]).

I think TIDL now supports batch processing (I guess since version 8.x). I created a batch-2 ONNX model, and compilation (TIDLCompilationProvider) appears to have worked fine.
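For reference, I exported the batch-2 model roughly like this (a minimal sketch: `SimpleClassifier` and the file name are placeholders, not my actual network):

    # Sketch of exporting a fixed batch-2 ONNX classifier (placeholder model).
    import torch
    import torch.nn as nn

    class SimpleClassifier(nn.Module):
        def __init__(self, num_classes: int = 26):
            super().__init__()
            self.backbone = nn.Sequential(
                nn.Conv2d(3, 16, kernel_size=3, padding=1),
                nn.ReLU(),
                nn.AdaptiveAvgPool2d(1),
            )
            self.fc = nn.Linear(16, num_classes)

        def forward(self, x):
            x = self.backbone(x)
            return self.fc(x.flatten(1))

    model = SimpleClassifier().eval()
    # Fixed batch of 2: no dynamic_axes, so the exported graph keeps N = 2.
    dummy = torch.randn(2, 3, 64, 64)
    torch.onnx.export(model, dummy, "classifier_batch2.onnx",
                      input_names=["input"], output_names=["output"],
                      opset_version=11)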

When I run inference (TIDLExecutionProvider), the session info reports a batch shape of 2 and input_data also has batch 2, but the output always has a batch of 1. For example, I added some prints in edgeai-tidl-tools/examples/osrt_python/ort/onnxrt_ep.py:

    # Input shape reported by the compiled session's metadata
    print('batch from session info:', input_details[0].shape)
    # Shape of the actual batch-2 input tensor
    print('input shape:', input_data.shape)
    output = list(sess.run(None, {input_name: input_data}))
    # The leading dimension here should match the input batch
    print('output shape:', output[0].shape)

and the result is:

    batch from session info: [2, 3, 64, 64]
    input shape: (2, 3, 64, 64)
    output shape: (1, 1, 1, 26)

I would expect the output shape to be (2, 1, 1, 26). Can't I verify batching results with this tool?
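Just to clarify what I mean by verifying batching results, this is the kind of sanity check I have in mind: running the same batch-2 model on the plain CPU execution provider to see whether the leading dimension survives without TIDL offload (the model path is a placeholder):

    # Sketch of a cross-check without TIDL offload; the model path is a placeholder.
    import numpy as np
    import onnxruntime as rt

    sess = rt.InferenceSession("classifier_batch2.onnx",
                               providers=["CPUExecutionProvider"])
    input_name = sess.get_inputs()[0].name
    input_data = np.random.rand(2, 3, 64, 64).astype(np.float32)

    output = sess.run(None, {input_name: input_data})
    # With pure CPU execution the leading dimension should stay at 2.
    print("output shape:", output[0].shape)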

fadilmohd007 commented 1 year ago

Hi @ywpkwon ,

Batch processing was supported in TIDL 8.1 in standalone mode only; support for batch processing at the OSRT level was added in the TIDL 8.2 release. Could you try the latest edgeai-tidl-tools release (8.2 or above), if you don't have a dependency on 8.1, and let us know?
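On 8.2 or above, a quick way to re-check would be something along these lines (a rough sketch only; the delegate option keys and paths follow the osrt_python examples in this repo and are placeholders you will need to adapt to your setup):

    # Sketch of re-checking batch output on a newer release.
    # Paths and delegate option keys are placeholders based on the osrt_python examples.
    import numpy as np
    import onnxruntime as rt

    # Confirm the TIDL execution provider is registered in this build.
    print(rt.get_available_providers())

    delegate_options = {
        "tidl_tools_path": "/path/to/tidl_tools",   # placeholder
        "artifacts_folder": "/path/to/artifacts",   # compiled batch-2 artifacts
    }
    sess = rt.InferenceSession(
        "classifier_batch2.onnx",                   # placeholder model path
        providers=["TIDLExecutionProvider", "CPUExecutionProvider"],
        provider_options=[delegate_options, {}],
    )

    input_name = sess.get_inputs()[0].name
    input_data = np.random.rand(2, 3, 64, 64).astype(np.float32)
    output = sess.run(None, {input_name: input_data})
    # On 8.2+ the leading dimension of the output should match the input batch.
    print("output shape:", output[0].shape)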