serizba / cppflow

Run TensorFlow models in C++ without installation and without Bazel
https://serizba.github.io/cppflow/
MIT License

Does cppflow::model have support for TensorFlow Lite models? #238

Open bmiftah opened 1 year ago

bmiftah commented 1 year ago

I tested cppflow by loading and running inference on a model saved in the SavedModel format. Model loading and prediction turned out to be slow, taking up to 4 seconds. While looking for ways to optimize this, I came across the idea of converting the model into a TensorFlow Lite model, an optimized FlatBuffer format identified by the .tflite file extension. The conversion can be done following the method on the official TensorFlow page. My question is: does cppflow::model support such a model, i.e. can it be loaded and inferred from? Or is there any tip for getting better inference speed, such as freezing the model? Any help is very much appreciated!
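For context, cppflow is a wrapper over the TensorFlow C API (libtensorflow), which loads SavedModels and GraphDefs but does not ship the TFLite interpreter, so a .tflite flatbuffer is a different beast from what cppflow::model reads. A quick stdlib-only sketch to tell the two apart: every .tflite file carries the FlatBuffers file identifier "TFL3" at byte offset 4 (the file name and byte offsets here are just for illustration):

```cpp
#include <cstring>
#include <fstream>
#include <string>

// Return true if the file carries the TFLite FlatBuffers identifier
// "TFL3", which sits at byte offset 4 of every .tflite file.
bool looks_like_tflite(const std::string& path) {
    std::ifstream f(path, std::ios::binary);
    char header[8] = {};
    f.read(header, sizeof(header));
    // Need at least 8 bytes; bytes 4..7 must spell "TFL3".
    return f.gcount() == 8 && std::memcmp(header + 4, "TFL3", 4) == 0;
}
```

A file that passes this check would need the separate TensorFlow Lite C/C++ API to run, not the libtensorflow session that cppflow drives.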

Norhanelocla commented 1 year ago

Did you find the answer, please?

bmiftah commented 1 year ago

Hi, as you can see above, I didn't get a reply to this issue. I still don't know how to use a Lite model in cppflow (perhaps there is some effort towards this, but I'm not aware of any yet). So I went ahead with a frozen model instead. In the process, I followed some suggestions given here to solve problems that came up while loading the frozen model.
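For anyone landing here, a minimal sketch of the frozen-graph route, assuming cppflow 2.x, whose model constructor accepts a FROZEN_GRAPH type flag. The file name `frozen_graph.pb`, the input shape, and the tensor names `x:0` / `Identity:0` are placeholders for illustration; list the real operation names in your graph with `model.get_operations()`:

```cpp
#include <iostream>
#include <cppflow/cppflow.h>

int main() {
    // Load a frozen GraphDef (.pb file) instead of a SavedModel directory.
    cppflow::model model("frozen_graph.pb", cppflow::model::FROZEN_GRAPH);

    // Dummy input tensor; adjust the shape to match your model.
    auto input = cppflow::fill({1, 224, 224, 3}, 1.0f);

    // With a frozen graph the input/output tensor names must be given
    // explicitly; inspect model.get_operations() to find them.
    auto output = model({{"x:0", input}}, {"Identity:0"});

    std::cout << output[0] << std::endl;
    return 0;
}
```

Note that the one-time graph load is usually the dominant cost; constructing the cppflow::model once and reusing it across predictions avoids paying that cost on every inference.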