@advaitjain Has this been socialized on developers@tensorflow.org? I don't see it there yet.
No, it hasn't. I became the RFC sponsor a week ago with c236808 (#408).
The updated decision is that the proposed script will be implemented in the TFLM repository instead of the TensorFlow repo, and additional discussion will happen on https://github.com/tensorflow/tflite-micro/issues/911.
https://github.com/tensorflow/tflite-micro/issues/911#issuecomment-1030287467 has some additional context.
@mansnils @ematejska, if one of you could close this PR, that would be great.
Sure, thanks for the explanation. Closing.
This RFC will be open for comment until Friday, February 25th, 2022.
MicroMutableOpResolver Generation
Objective
The objective is to simplify the process of parsing a .tflite file for its operators and to generate code that can easily be integrated into an application.
Goals
Motivation
When using an interpreter, you need an operator resolver. For this, you have two options:
1) Using the AllOpsResolver links all of the TFLM operators into the executable, which adds significantly to the memory footprint. A model typically requires only a small subset of operators, and given the limited memory of microcontrollers, the MicroMutableOpResolver is a better option. Example usage of AllOpsResolver:
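A minimal sketch of this usage is shown below. The names `RunInference`, `tensor_arena`, and `kTensorArenaSize` are placeholders chosen for illustration, and the exact MicroInterpreter constructor arguments may differ slightly between TFLM versions.

```cpp
#include <cstdint>

#include "tensorflow/lite/micro/all_ops_resolver.h"
#include "tensorflow/lite/micro/micro_interpreter.h"
#include "tensorflow/lite/schema/schema_generated.h"

namespace {
// Placeholder arena; the required size depends on the application and model.
constexpr int kTensorArenaSize = 10 * 1024;
uint8_t tensor_arena[kTensorArenaSize];
}  // namespace

void RunInference(const tflite::Model* model) {
  // Every TFLM kernel is linked into the binary, whether or not the model
  // actually uses it.
  tflite::AllOpsResolver op_resolver;

  tflite::MicroInterpreter interpreter(model, op_resolver, tensor_arena,
                                       kTensorArenaSize);
  interpreter.AllocateTensors();
  // ... fill the input tensors, then run the model.
  interpreter.Invoke();
}
```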
2) Using the MicroMutableOpResolver includes only the operators specified by the user. This requires manually finding out which operators the model uses, for example with a visualization tool. The smaller memory footprint makes this the preferred option for real-world usage, but it demands more work from the user, and it becomes impractical for automated tests, where the operators would have to be found and added by hand for every model. Example usage:
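A minimal sketch, parallel to the previous one, is shown below. The operators registered here (Conv2D, MaxPool2D, FullyConnected) are only examples; the real set depends on the model, and producing exactly this kind of registration code is what the proposed script would automate.

```cpp
#include <cstdint>

#include "tensorflow/lite/micro/micro_interpreter.h"
#include "tensorflow/lite/micro/micro_mutable_op_resolver.h"
#include "tensorflow/lite/schema/schema_generated.h"

namespace {
// Placeholder arena, as in the previous sketch.
constexpr int kTensorArenaSize = 10 * 1024;
uint8_t tensor_arena[kTensorArenaSize];
}  // namespace

void RunInference(const tflite::Model* model) {
  // Only the operators the model actually needs are registered; the template
  // argument is the number of operators that will be added.
  tflite::MicroMutableOpResolver<3> op_resolver;
  op_resolver.AddConv2D();
  op_resolver.AddMaxPool2D();
  op_resolver.AddFullyConnected();

  tflite::MicroInterpreter interpreter(model, op_resolver, tensor_arena,
                                       kTensorArenaSize);
  interpreter.AllocateTensors();
  interpreter.Invoke();
}
```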