tensorflow / tflite-support

TFLite Support is a toolkit that helps users develop ML and deploy TFLite models onto mobile / IoT devices.
Apache License 2.0

How to handle dynamic output tensors with `tflite.runForMultipleInputsOutputs` #962

Open hello-fri-end opened 10 months ago

hello-fri-end commented 10 months ago

I'm working with a text-to-speech model, so I cannot predict the size of the output tensors beforehand. According to the documentation, "some models have dynamic outputs, where the shape of output tensors can vary depending on the input. There's no straightforward way of handling this with the existing Java inference API, but planned extensions will make this possible." What's the non-straightforward way of handling it? For `tflite.run`, it says [here](https://www.tensorflow.org/lite/api_docs/java/org/tensorflow/lite/Tensor#asReadOnlyBuffer()) that you can pass null as the output, but the same doesn't work with `runForMultipleInputsOutputs`.

P.S.: I cannot just use a buffer large enough to handle all possible inputs, because I have actually split my model into 3 tflite files and need to feed the output of one model into the next, so the buffer has to be exactly the size the next model expects.
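For what it's worth, one pattern that has worked for me (a sketch, not an official answer): pass an *empty* outputs map to `runForMultipleInputsOutputs` so no fixed-size buffer is bound, then query the output `Tensor` directly after inference, when its shape has been resolved, and copy the data out via `asReadOnlyBuffer()`. The `Interpreter`, `Tensor`, `runForMultipleInputsOutputs`, `getOutputTensor`, `shape`, and `asReadOnlyBuffer` calls below are from the TFLite Java API; the method and class names I wrap them in are hypothetical.

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.util.HashMap;
import java.util.Map;

import org.tensorflow.lite.Interpreter;
import org.tensorflow.lite.Tensor;

public class DynamicOutputSketch {

    // Run inference without pre-allocating output buffers, then read the
    // dynamically shaped output after the fact. Returns the raw float data;
    // the actual shape can be read from getOutputTensor(0).shape().
    public static float[] runWithDynamicOutput(Interpreter interpreter, Object[] inputs) {
        // Empty outputs map: no caller-owned buffers are bound, so the
        // interpreter keeps its own (dynamically sized) output storage.
        Map<Integer, Object> outputs = new HashMap<>();
        interpreter.runForMultipleInputsOutputs(inputs, outputs);

        // After inference the output tensor's shape reflects this run.
        Tensor outputTensor = interpreter.getOutputTensor(0);
        int[] shape = outputTensor.shape(); // e.g. pass along to the next model

        // Copy the data out of the interpreter-owned buffer.
        ByteBuffer raw = outputTensor.asReadOnlyBuffer().order(ByteOrder.nativeOrder());
        float[] result = new float[raw.remaining() / Float.BYTES];
        raw.asFloatBuffer().get(result);
        return result;
    }
}
```

For the chained-model case, this also gives you the exact shape to hand to the next stage: call `interpreter.resizeInput(0, shape)` on the downstream interpreter (assuming its input index is 0) before running it, so its input matches the upstream output exactly.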