microsoft / onnxruntime

ONNX Runtime: cross-platform, high performance ML inferencing and training accelerator
https://onnxruntime.ai
MIT License

How to create input tensor for multiple input examples (batch mode inference)? #18881

Open · paranjapeved15 opened 11 months ago

paranjapeved15 commented 11 months ago

Describe the issue

I am looking for code examples that create an input of the form Map&lt;String, OnnxTensor&gt; covering multiple input examples (batch-mode inference). Can you please point me to one?

To reproduce

NA

Urgency

No response

Platform

Mac

OS Version

13.5.2

ONNX Runtime Installation

Released Package

ONNX Runtime Version or Commit ID

1.16.2

ONNX Runtime API

Java

Architecture

X64

Execution Provider

Default CPU

Execution Provider Library Version

No response

Craigacp commented 11 months ago

Create a buffer of size [batch_size, feature_dimension] and then write the examples into it. For example, this method writes a batch of examples into a single tensor used as input: https://github.com/oracle-samples/sd4j/blob/main/src/main/java/com/oracle/labs/mlrg/sd4j/TextEmbedder.java#L218.
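A minimal sketch of that approach (illustrative, not the linked sd4j code; the class and method names are made up): pack the examples into one direct FloatBuffer row by row, then wrap it in a single tensor of shape [batchSize, featureDim].

```java
import ai.onnxruntime.OnnxTensor;
import ai.onnxruntime.OrtEnvironment;
import ai.onnxruntime.OrtException;

import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;
import java.util.List;

public class BatchTensorExample {
    // Packs a list of equal-length float feature vectors into one
    // [batchSize, featureDim] tensor, one example per row.
    static OnnxTensor batchOf(OrtEnvironment env, List<float[]> examples) throws OrtException {
        int batchSize = examples.size();
        int featureDim = examples.get(0).length;
        FloatBuffer buffer = ByteBuffer
                .allocateDirect(batchSize * featureDim * 4) // 4 bytes per float
                .order(ByteOrder.nativeOrder())
                .asFloatBuffer();
        for (float[] example : examples) {
            buffer.put(example); // each example becomes one row
        }
        buffer.rewind(); // reset the position so the runtime reads from the start
        return OnnxTensor.createTensor(env, buffer, new long[]{batchSize, featureDim});
    }
}
```

Using a direct buffer in native byte order should let createTensor use the memory without an extra copy; a non-direct buffer also works but gets copied.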

paranjapeved15 commented 11 months ago

@Craigacp I have input features with various datatypes - float, string. How can I structure my inputs in that case? I guess the above method would not work then?

Craigacp commented 11 months ago

How are you currently formatting those inputs for the ONNX model? If it accepts multiple tensor inputs and the leading dimension of each of those inputs is -1, then it's still expecting batches, so batch each tensor separately.
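A hedged sketch of that pattern, assuming a model with one float input and one string input. The model path and the input names "float_feature" and "string_feature" are placeholders; they must match the names your model actually declares (check session.getInputNames()).

```java
import ai.onnxruntime.OnnxTensor;
import ai.onnxruntime.OrtEnvironment;
import ai.onnxruntime.OrtException;
import ai.onnxruntime.OrtSession;

import java.nio.FloatBuffer;
import java.util.Map;

public class MultiInputBatchExample {
    public static void main(String[] args) throws OrtException {
        OrtEnvironment env = OrtEnvironment.getEnvironment();
        try (OrtSession session = env.createSession("model.onnx", new OrtSession.SessionOptions())) {
            int batchSize = 2;

            // Float input: shape [batchSize, 3], one row per example.
            FloatBuffer floats = FloatBuffer.wrap(new float[]{
                    0.1f, 0.2f, 0.3f,   // example 0
                    0.4f, 0.5f, 0.6f}); // example 1
            OnnxTensor floatInput = OnnxTensor.createTensor(env, floats, new long[]{batchSize, 3});

            // String input: shape [batchSize, 1], batched the same way.
            String[] strings = new String[]{"first example", "second example"};
            OnnxTensor stringInput = OnnxTensor.createTensor(env, strings, new long[]{batchSize, 1});

            // Each tensor is batched separately; the map keys are the model's input names.
            Map<String, OnnxTensor> inputs = Map.of(
                    "float_feature", floatInput,
                    "string_feature", stringInput);

            try (OrtSession.Result result = session.run(inputs)) {
                System.out.println(result.get(0).getValue());
            }
        }
    }
}
```

The key point is that batching happens per tensor: every input keeps its own dtype and its own [batchSize, ...] shape, and example i occupies row i of every tensor in the map.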

github-actions[bot] commented 10 months ago

This issue has been automatically marked as stale due to inactivity and will be closed in 30 days if no further activity occurs. If further support is needed, please provide an update and/or more details.