asus4 / tf-lite-unity-sample

TensorFlow Lite Samples on Unity

The Interpreter must be instantiated after each inference. #260

Closed francescoyou97 closed 1 year ago

francescoyou97 commented 1 year ago


Problem I have a simple GameObject which extends MonoBehaviour. In the Start method I load a model in tflite format (already working on Windows/Python and on Android). In the Update method, the first inference works correctly, but every following one returns an output of 0 for each class. I then verified that if I reload the model before each inference, I get correct probability values for all classes. The question is: is it actually necessary to reload the model and reallocate its tensors before every inference? The model is made up of LSTM layers, I don't know if this can help.

To Reproduce

```csharp
using System;
using System.IO;
using System.Linq;
using TensorFlowLite;
using UnityEngine;

public class Classifier : MonoBehaviour
{
    private Interpreter _interpreter;

    private void Start()
    {
        InitClassifier();
    }

    private void InitClassifier()
    {
        var options = new InterpreterOptions()
        {
            threads = 4,
            useNNAPI = true,
        };

        try
        {
            var path = "path";
            _interpreter = new Interpreter(File.ReadAllBytes(path), options);
            _interpreter.AllocateTensors();
        }
        catch
        {
            Debug.Log("Error During Model Loading");
        }
    }

    private void Inference(float[,] input)
    {
        var output = new float[1, 84];
        //InitClassifier(); // To uncomment.
        try
        {
            _interpreter.SetInputTensorData(0, input);
            _interpreter.Invoke();
            _interpreter.GetOutputTensorData(0, output);
        }
        catch
        {
            Debug.Log("Error During model Inference.");
        }

        string prob = "Prob: ";
        for (var i = 0; i < output.Length; i++)
        {
            prob += output[0, i] + ", ";
        }
        Debug.Log(prob);
    }
}
```

Expected behavior I expect to be able to instantiate the model a single time.


Additional context This is the output when I comment out the call to InitClassifier(): Prob: 0, 0, 0, 0, 0, 0, 0, 0, ... (all 84 class probabilities are 0)

This is the output when I add the InitClassifier() Prob: 7,104152E-07, 1,24909E-05, 1,211954E-06, 1,081635E-05, 0,02810379, 7,72827E-07, 7,61087E-07, 7,44329E-07, 8,9061E-07, 8,337704E-07, 9,372902E-07, 8,054589E-07, 8,193072E-07, 7,71486E-07, 7,609186E-07, 7,700357E-07, 9,047312E-08, 7,653939E-07, 7,043034E-07, 8,130228E-07, 8,82825E-07, 7,779054E-07, 7,779989E-07, 8,052423E-07, 2,023554E-05, 5,416047E-08, 2,098425E-08, 4,542941E-06, 0,004308983, 0,9487795, 5,989123E-05, 1,518722E-06, 3,647282E-06, 4,470758E-07, 7,680218E-07, 7,699315E-07, 8,577263E-07, 8,360141E-07, 8,089091E-07, 0,01855268, 8,564847E-07, 7,743955E-07, 8,874179E-07, 7,213466E-07, 7,168855E-07, 8,263641E-07, 9,438788E-07, 7,054812E-07, 6,870035E-07, 8,223429E-07, 7,539402E-07, 8,012559E-07, 7,191746E-07, 7,732921E-07, 8,277065E-07, 8,245551E-07, 8,984515E-07, 7,030197E-07, 7,01389E-07, 7,558768E-07, 8,584423E-07, 7,835596E-07, 7,927347E-07, 1,102207E-06, 8,635316E-05, 7,372397E-07, 7,871254E-07, 8,559409E-07, 8,441499E-07, 6,981523E-07, 7,616548E-07, 8,006127E-07, 7,276042E-07, 8,141548E-07, 7,63205E-07, 9,659776E-07, 7,901572E-07, 7,085409E-07, 8,889036E-07, 8,057924E-07, 9,725479E-07, 8,468212E-07, 8,811058E-07, 7,928383E-07, ...

francescoyou97 commented 1 year ago

Excuse me, I wrote above that I do inference in the Update method. Actually, I wrote an Inference method that I call externally when needed.

asus4 commented 1 year ago

Hey @francescoyou97 The code looks ok, but we cannot reproduce your issue without your model and code. Can you share the project, including the TFLite file?

francescoyou97 commented 1 year ago

model and code.zip

I cannot share the entire project. Inside the zip you can find the code and the model.tflite file. If you can help me, you only need to set the path of the model inside the code. Thanks for the answer.

stale[bot] commented 1 year ago

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

Alexiush commented 1 year ago

Hi! I have the same behaviour (an RNN-like model too, a GRU actually): after the first inference I receive 0 as output. I'm also returning the states as another output, and they change as they should. ModelGitArchive.zip

Alexiush commented 1 year ago

Oh, it looks like I sent outdated code that may have problems other than the one mentioned in this issue. Here is the version I'm currently using: TextGenerator.txt

asus4 commented 1 year ago

@francescoyou97 @Alexiush @qlee01 Sorry for the late reply:

I looked into this a bit, and I suspect the arrays passed to SetInputTensorData and GetOutputTensorData need to stay the same instances for the lifetime of an Interpreter, since the arrays' pointers are cached in inputDataHandles.

https://github.com/asus4/tf-lite-unity-sample/blob/4ae9930f2cc0eb80ab1de29c6fd40f538fa05a8d/Packages/com.github.asus4.tflite/Runtime/Interpreter.cs#L115-L119

Can you try allocating input and output arrays as a class member and reusing them?
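A minimal sketch of the suggested pattern, assuming the Interpreter API shown earlier in this thread; the model path and the tensor shapes ([1, 30] input, [1, 84] output) are placeholders, not values from the original project:

```csharp
using System.IO;
using TensorFlowLite;
using UnityEngine;

public class Classifier : MonoBehaviour
{
    private Interpreter _interpreter;

    // Allocated once and reused: the wrapper caches pinned pointers
    // to these arrays, so passing a fresh array on every call can
    // leave the interpreter reading from / writing to stale memory.
    private readonly float[,] _input = new float[1, 30];  // placeholder shape
    private readonly float[,] _output = new float[1, 84];

    private void Start()
    {
        _interpreter = new Interpreter(
            File.ReadAllBytes("path"), new InterpreterOptions());
        _interpreter.AllocateTensors();
    }

    public float[,] Inference(float[,] source)
    {
        // Copy new values into the cached input array instead of
        // swapping the array reference itself.
        System.Array.Copy(source, _input, source.Length);

        _interpreter.SetInputTensorData(0, _input);
        _interpreter.Invoke();
        _interpreter.GetOutputTensorData(0, _output);
        return _output;
    }
}
```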

Alexiush commented 1 year ago

@asus4 I tried making it like that, and now it does not spam 0 after the first inference. However, it still doesn't work as it should: for the model to run as intended, it still needs to be reinitialized.

I checked whether it's connected to the operations I do during initialization, but it doesn't look like it, because performing them without re-instantiating the interpreter changes nothing.

Also, my workaround needs a local array, not a class member; I wonder why: Workaround.zip

qlee01 commented 1 year ago

> Can you try allocating input and output arrays as a class member and reusing them?

Hey, I tried using them as class members and only changing them when needed; still the same result (all output values "0").

Edit: Not quite correct, as I had two Interpreters running. Now I am getting values, but something else seems off (the values don't seem correct).

qlee01 commented 1 year ago

I finally managed to resolve the issue in my case, thanks to the advice from @asus4: I also needed to keep the output shape arrays as class members. Now it works as intended.

Alexiush commented 1 year ago

I resolved my issue too; it turned out to be about how I was running the model asynchronously.

asus4 commented 1 year ago

Cool, I will close this then. It would probably be better to add some comments to the SetInputTensorData and GetOutputTensorData functions documenting this behavior.

ChristianStaschinski commented 11 months ago

@qlee01 could you please share how you used the arrays as class members? I still get "0" output values. This issue could be closed. https://github.com/voxell-tech/UnityTTS/issues/3