Closed brunofmf closed 2 years ago
Hi Bruno,
Unfortunately it does not yet work with TFLite -- it's on our list of TODOs, but it's not there yet.
In the meantime, we do have a C++ interface in the associated Yggdrasil library: you can load a model trained in TF from the C++ library and run it there. The C++ library is pretty lightweight (no dependency on the TensorFlow engine) and runs very fast on CPUs. Maybe it can easily be compiled for phones? (We haven't tried that yet, though.)
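For reference, a minimal sketch of what the C++ route above could look like with the Yggdrasil serving API. The model path and the feature name `"age"` are assumptions for illustration, and the exact call names should be checked against the current YDF serving headers; this is a sketch, not a definitive implementation:

```cpp
#include <memory>
#include <vector>

#include "yggdrasil_decision_forests/model/model_library.h"
#include "yggdrasil_decision_forests/serving/example_set.h"

namespace ydf = yggdrasil_decision_forests;

int main() {
  // Load the Yggdrasil model. For a TF-DF Keras model saved with
  // model.save(path), the Yggdrasil model is stored under path/assets/.
  std::unique_ptr<ydf::model::AbstractModel> model;
  QCHECK_OK(ydf::model::LoadModel("/path/to/saved_model/assets", &model));

  // Compile the model into a fast serving engine (no TensorFlow runtime).
  auto engine = model->BuildFastEngine().value();
  const auto& features = engine->features();

  // "age" is a hypothetical numerical feature of the model.
  const auto age = features.GetNumericalFeatureId("age").value();

  // Allocate a batch of one example and set its feature values.
  auto examples = engine->AllocateExamples(1);
  examples->FillMissing(features);
  examples->SetNumerical(/*example_idx=*/0, age, 35.f, features);

  // Run inference; one prediction per example (per class, for classification).
  std::vector<float> predictions;
  engine->Predict(*examples, /*num_examples=*/1, &predictions);
  return 0;
}
```

Building this requires linking against the Yggdrasil Decision Forests library (e.g. via Bazel), but not against the TensorFlow engine, which is what makes it a plausible candidate for on-device use.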
I'm marking this as a duplicate of #5 (about TF-Lite): once we integrate TF-DF's ops into TF Lite, we hope it will also work on-device.
OK Jan, thanks!
Good day.
Just to check: would I be able to save a TF-DF model (in, for example, a TFLite format) and then load the model to perform on-device inference on smartphones?
Thank you. Cheers.