eloquentarduino / EloquentTinyML

Eloquent interface to Tensorflow Lite for Microcontrollers

Debug support? #4

Closed Workshopshed closed 4 years ago

Workshopshed commented 4 years ago

I'm trying to get a model working on the Arduino MKR board and I believe it's hanging in the constructor for Eloquent::TinyML::TfLite.

In my code it does not get past "Initialising..."


TextClassifierExample.ino.txt text_model.h.txt

The porting advice mentions that it should be possible to add a debug logger.

https://github.com/tensorflow/tensorflow/blob/master/tensorflow/lite/micro/README.md

There is some more advice on this in

https://github.com/eloquentarduino/EloquentTinyML/blob/master/src/tensorflow/lite/micro/debug_log.cpp

I tried adding this to my sketch, but with no effect.

#include <Arduino.h>

// route TFLM debug output to the serial port instead of stderr
extern "C" void DebugLog(const char* s) { Serial.print(s); } // was: fprintf(stderr, "%s", s);
eloquentarduino commented 4 years ago

This is the stacktrace I get on my ESP32

0x400e61d1: tflite::BytesRequiredForTensor(tflite::Tensor const&, unsigned int*, unsigned int*, tflite::ErrorReporter*) at ~/Arduino/libraries/EloquentTinyML/src/flatbuffers/flatbuffers.h line 227
0x400e6745: tflite::internal::InitializeRuntimeTensor(tflite::SimpleMemoryAllocator*, tflite::Tensor const&, flatbuffers::Vector<flatbuffers::Offset<tflite::Buffer>> const*, tflite::ErrorReporter*, TfLiteTensor*) at ~/Arduino/libraries/EloquentTinyML/src/tensorflow/lite/micro/micro_allocator.cpp line 267
0x400e68f2: tflite::MicroAllocator::Init() at ~/Arduino/libraries/EloquentTinyML/src/tensorflow/lite/micro/micro_allocator.cpp line 353
0x400e6936: tflite::MicroAllocator::MicroAllocator(TfLiteContext*, tflite::Model const*, unsigned char*, unsigned int, tflite::ErrorReporter*) at ~/Arduino/libraries/EloquentTinyML/src/tensorflow/lite/micro/micro_allocator.cpp line 375
0x400e6f85: tflite::MicroInterpreter::MicroInterpreter(tflite::Model const*, tflite::OpResolver const&, unsigned char*, unsigned int, tflite::ErrorReporter*) at ~/Arduino/libraries/EloquentTinyML/src/tensorflow/lite/micro/micro_interpreter.cpp line 56
0x400d1983: Eloquent::TinyML::TfLite<16u, 1u, 4096u>::TfLite(unsigned char*) at ~/Arduino/libraries/EloquentTinyML/src/EloquentTinyML.h line 46
0x400d1a30: setup() at ~/Arduino/projects/eloquent.blog/Ticket__Tinyml/Ticket__Tinyml.ino line 24
0x400e82fb: loopTask(void*) at /home/simone/.arduino15/packages/esp32/hardware/esp32/1.0.4/cores/esp32/main.cpp line 14
0x40088385: vPortTaskWrapper at /home/runner/work/esp32-arduino-lib-builder/esp32-arduino-lib-builder/esp-idf/components/freertos/port.c line 143

It looks like a memory allocation problem. I tried increasing TENSOR_ARENA_SIZE to 8*1024, but then I get a StackOverflow error. Please try the same on your board: experiment with different values and report back if anything changes. In the meantime I will try to track the problem down.

Workshopshed commented 4 years ago

Thanks. I did try it with 10*1024, but no success. One of my challenges seems to be getting the model size small enough. I'll see if I can optimise it further.

Workshopshed commented 4 years ago

I have been experimenting with smaller models and also with the optimisations. I got the following error from my serial monitor, which does suggest that the debug code is working to some extent:

Initialising...
Type FLOAT16 (10) not is not supported
Failed to initialize tensor 1
MicroAllocator: Failed to initialize.
AllocateTensors() failed

eloquentarduino commented 4 years ago

Can you please send me your Python code? I need to see your NN structure and exporting configuration.


Workshopshed commented 4 years ago

This is the one I was using at the time, a simple two-dense-layer model with really poor training.

model = tf.keras.Sequential([
  tf.keras.layers.Dense(16, activation='relu', input_shape=(8,)),
  tf.keras.layers.Dense(2)
  ])

With the following optimisation

converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.target_spec.supported_types = [tf.float16]

https://github.com/Workshopshed/TinyMLTextClassification/blob/e33e6e811367df6bbdd26d3ebf4a0be3821c4ab2/key_classification_rnn.ipynb
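For reference, the "Type FLOAT16 (10) not is not supported" error above suggests the FLOAT16 export path is the culprit, since this build of TFLite Micro cannot allocate FLOAT16 tensors. A minimal sketch of an export that avoids it, reusing the two-layer model above (the output filename is a placeholder, not from the notebook):

```python
import tensorflow as tf

# same tiny two-dense-layer model as above
model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation='relu', input_shape=(8,)),
    tf.keras.layers.Dense(2),
])

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
# crucially, do NOT set converter.target_spec.supported_types = [tf.float16]:
# FLOAT16 weight tensors are what MicroAllocator rejects above
tflite_model = converter.convert()

with open('text_model.tflite', 'wb') as f:  # placeholder filename
    f.write(tflite_model)
```

With the default optimization alone the converter keeps tensor types the micro runtime can handle, at the cost of a somewhat larger flatbuffer than the float16 variant.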

eloquentarduino commented 4 years ago

Sorry for my delay. The notebook reports a dummy example: does it work on your microcontroller? Please post the exact code you're using and the dataset you want to classify (in CSV format, since I can't install tensorflow-text).

Workshopshed commented 4 years ago

Back to the issue at hand, debugging the constructor.

If we look back at TextClassifierExample.ino.txt you'll see that I tried to create the variable inside setup() so that it would be constructed after Serial was initialised. That does not work: you have to put the declaration outside of setup() for it to work.

Eloquent::TinyML::TfLite<NUMBER_OF_INPUTS, NUMBER_OF_OUTPUTS, TENSOR_ARENA_SIZE> ml((unsigned char*)model_data);

So Serial would never be available when you actually need it.
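One possible way around this chicken-and-egg problem, assuming nothing in the library rules it out, is to keep only a global pointer and construct the object with `new` inside setup(), after Serial.begin(); that also keeps the large object off setup()'s stack. The sketch below demonstrates the pattern with stand-in classes so it compiles off-device; `Logger` and `TfLiteStandIn` are hypothetical stand-ins, not the real Arduino or EloquentTinyML API.

```cpp
#include <cstdio>

// Stand-in for Arduino's Serial: ignores output until begin() is called,
// mimicking a serial port that is not yet initialised.
struct Logger {
    bool ready = false;
    void begin(unsigned long) { ready = true; }
    void print(const char* s) { if (ready) std::printf("%s", s); }
} Serial;

// Stand-in for the library's TfLite class: logs from its constructor,
// like the DebugLog hook discussed earlier in this thread.
struct TfLiteStandIn {
    TfLiteStandIn() { Serial.print("constructor ran after Serial.begin\n"); }
};

TfLiteStandIn* ml = nullptr;  // global pointer: nothing constructed at static-init time

void setup() {
    Serial.begin(115200);        // serial comes up first...
    ml = new TfLiteStandIn();    // ...so the constructor's debug output is visible
}
```

On the real board this would read something like `ml = new Eloquent::TinyML::TfLite<NUMBER_OF_INPUTS, NUMBER_OF_OUTPUTS, TENSOR_ARENA_SIZE>((unsigned char*)model_data);`, assuming the class tolerates heap allocation.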

eloquentarduino commented 4 years ago

Does it work fine if you put the declaration before setup?
