jakobkilian opened 10 months ago
I think I found the issue and tried to solve it. Maybe I don't fully understand the code and am wrong – this is just a proposal and the workaround I use in my current projects, which I want to share:
So I checked the header files of your lib and saw that (in comparison to the older Eloquent TinyML library) the `predict()` function is not overloaded to accept either `int` or `float` values. Instead, it is now a template that always uses `input->data.int8[i]` to copy data into the model. To my understanding, calling `predict()` with `float` values (as described in the SineExample.ino example) therefore cannot work correctly.
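To illustrate what I mean with plain NumPy (nothing library-specific, just the effect of the cast): assigning a `float` through the int8 view truncates the value, so typical sine inputs lose almost all of their information:

```python
import numpy as np

# what effectively happens when a float input lands in input->data.int8[i]
print(np.int8(np.float32(0.79)))  # -> 0: the fractional part is truncated away
print(np.int8(np.float32(3.14)))  # -> 3: barely any resolution left
```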
I therefore forked the repo to implement overloads for `int` and `float` values. This is only temporary; once it is fixed upstream, I will delete the fork again...
I couldn't test this with models that expect int8 values, as I couldn't find out how to build them. Generally, I am very confused about where the quantization is actually supposed to happen: the models I build usually do it internally and just expect float input. I would be happy about an explanation.
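To illustrate my confusion: as far as I understand, a model that really expects int8 input would have to be built with the plain TFLite converter roughly like this (untested sketch; the model path and the representative dataset are placeholders):

```python
import numpy as np
import tensorflow as tf

converter = tf.lite.TFLiteConverter.from_saved_model("sine_model")  # placeholder path
converter.optimizations = [tf.lite.Optimize.DEFAULT]

def representative_dataset():
    # a few hundred samples covering the input range, here x in [0, 2*pi)
    for _ in range(200):
        yield [np.random.uniform(0, 2 * np.pi, size=(1, 1)).astype(np.float32)]

converter.representative_dataset = representative_dataset
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.int8   # otherwise the input stays float
converter.inference_output_type = tf.int8

with open("sine_model_int8.tflite", "wb") as f:
    f.write(converter.convert())
```

Without the last two `inference_*_type` lines, the converter quantizes the weights internally but keeps float input and output, which matches what my models do.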
Thanks anyway, hope this helps
Hi! I am actively trying to get this library to run, but inference always returns 0. The same thing happens with the old Eloquent TinyML library. I used the old library a lot and therefore kept TensorFlow at version 2.1.1 to stay compatible. Is there a version recommendation for this new library? Maybe a version mismatch causes the issue. I was using:
- Python 3.11.7
- NumPy 1.26.3
- TensorFlow 2.15.0
I also tried to export the model with everywhereml, with the same result. No errors with `tf.exception`. Any ideas? Maybe you can provide your model for the SineExample.ino to test? I tested my .tflite with Python on the computer → there it works... I am happy to help if I can, but my expertise is medium only...
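For reference, this is roughly how I tested the .tflite on the computer (sketch; the file name is a placeholder):

```python
import numpy as np
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="sine_model.tflite")
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

x = np.array([[1.57]], dtype=np.float32)  # ~pi/2, expect an output near 1
interpreter.set_tensor(inp["index"], x)
interpreter.invoke()
print(interpreter.get_tensor(out["index"]))  # works here, returns ~sin(x)
```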