ywpkwon opened this issue 1 year ago
Hi @ywpkwon,
Batch processing was supported in TIDL 8.1 (standalone only); support for batch processing at the OSRT level was added in the TIDL 8.2 release. Could you try the latest edgeai-tidl-tools release (8.2 or above) if you don't have a dependency on 8.1, and let us know?
I'm using TIDL 8.1 and trying a simple classification model (input: [N, 3, 64, 64], output: [N, 26]).
I think TIDL now supports batch processing (I guess since one of the 8.x releases). I created a batch-2 ONNX model, and I believe compilation (TIDLCompilationProvider) worked well.
When I run inference (TIDLExecutionProvider), the session info reports a batch shape of 2 and `input_data` also has batch size 2, but the output is always a batch of 1. For example, I added some prints in `edgeai-tidl-tools/examples/osrt_python/ort/onnxrt_ep.py`, and the printed output has a batch of 1.
I think the output shape should be (2, 1, 1, 26). Can't I check batching results with this tool?
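For reference, a minimal sketch of the kind of shape check being described, assuming TI's onnxruntime build (which registers the TIDL execution providers) and artifacts already produced by a compilation run. The model path and the `delegate_options` key are placeholders for illustration, not values from this thread:

```python
import numpy as np
import onnxruntime as rt

# Placeholders: a batch-2 ONNX model and the artifacts folder written
# earlier by TIDLCompilationProvider (exact option keys are assumptions).
MODEL_PATH = "model_batch2.onnx"
delegate_options = {"artifacts_folder": "./model-artifacts"}

sess = rt.InferenceSession(
    MODEL_PATH,
    providers=["TIDLExecutionProvider", "CPUExecutionProvider"],
    provider_options=[delegate_options, {}],
    sess_options=rt.SessionOptions(),
)

# Confirm the session itself sees the batch-2 input shape.
inp = sess.get_inputs()[0]
print("session input shape:", inp.shape)   # expect [2, 3, 64, 64]

# Feed a batch-2 input and inspect the output batch dimension.
input_data = np.random.rand(2, 3, 64, 64).astype(np.float32)
outputs = sess.run(None, {inp.name: input_data})
print("output shape:", outputs[0].shape)   # expected (2, 1, 1, 26); the issue reports batch 1
```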