-
[TF Lite Micro (link - supported platforms)](https://www.tensorflow.org/lite/microcontrollers#supported_platforms) makes on-device ML inference possible, enabling powerful example applications like…
mbz4 updated
3 months ago
-
I am trying to compile a TFLite object detection model for imx8qm using SageMaker Neo. I supplied a .tflite file and specified the input config as `{"input":[1, 300, 300, 3]}`
But the compilation fai…
-
### 1. System information
- Google Colab Notebook
### 2. Code
#### Model Definition

```
class CppTfTest(tf.Module):
    def __init__(self, name=None):
        super().__init…
```
-
Convert the ViT below to TFLite:
https://github.com/taki0112/vit-tensorflow
-
Nils, is it possible to create an integer-only model so this could run on accelerators or frameworks such as ArmNN?
https://www.tensorflow.org/lite/performance/post_training_quantization#full_integer…
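The linked guide's full-integer post-training quantization path is what that requires; a minimal sketch, assuming a tiny stand-in Keras model and random calibration data in place of the real network and dataset:

```python
import numpy as np
import tensorflow as tf

# Tiny stand-in model; substitute the real network here.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(4,)),
    tf.keras.layers.Dense(2),
])

def representative_dataset():
    # Calibration samples; in practice, yield ~100 real inputs.
    for _ in range(10):
        yield [np.random.rand(1, 4).astype(np.float32)]

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_dataset
# Restrict to integer-only builtins so integer-only accelerators
# (or frameworks such as ArmNN) can run the whole graph.
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.int8
converter.inference_output_type = tf.int8

tflite_model = converter.convert()
```

With `inference_input_type`/`inference_output_type` set to `tf.int8`, even the model's boundary tensors are integer, so no float ops remain in the converted graph.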
-
Good day,
I have the following model that I am using for inference on an ESP32. The model is a minimal "forward-only" definition in TensorFlow for an early time-series classification model that I …
-
Hello,
Currently I am trying to bundle a deb package for Poricom which uses manga_ocr.
However, the torch dependencies of the transformers backend are making the deb excessively large.
```
$ fi…
```
-
Hi Everyone,
I tried to post-training quantize the FastSpeech2 model from TensorFlowTTS, but got an error message. Does anyone know the details of this error and how to fix it? Thanks!
Use …
-
No matter the size of the LSTM model, converting it with float16 optimization runs out of memory.
**Code to reproduce the issue**
[The code snippet to reproduce the issue on Google Colab](https://…
-
We want to implement RL on an Android device. Just wondering if it is possible to run tf-agents on Android or to convert tf-agents to TFLite. It will be great if someone can share some experience. Thank…
ssujr updated
3 years ago
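tf-agents itself does not run on Android, but its `PolicySaver` exports a trained policy as a SavedModel, which the TFLite converter can consume. A minimal sketch, with a hypothetical stand-in `tf.Module` in place of a real exported policy:

```python
import tempfile
import tensorflow as tf

class TinyPolicy(tf.Module):
    """Hypothetical stand-in for a tf-agents policy SavedModel."""

    @tf.function(input_signature=[tf.TensorSpec([1, 4], tf.float32)])
    def action(self, observation):
        # Greedy action: index of the largest observation entry.
        return tf.argmax(observation, axis=-1)

policy = TinyPolicy()
export_dir = tempfile.mkdtemp()
# Register `action` as the serving signature so the converter finds it.
tf.saved_model.save(policy, export_dir, signatures=policy.action)

converter = tf.lite.TFLiteConverter.from_saved_model(export_dir)
tflite_model = converter.convert()
```

The resulting `tflite_model` bytes can be bundled into an Android app and invoked with the TFLite Interpreter API; training still has to happen off-device.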