tensorflow / models

Models and examples built with TensorFlow

How can I save a frozen inference graph in TF2? #9270

Open aitormf opened 4 years ago

aitormf commented 4 years ago

I have trained a network so I can use it in TensorFlow.js, but it runs very slowly because it was converted from a SavedModel instead of a frozen graph, as was done for the pretrained models provided here.

To check that the problem was not on my side, I took ssdlite_mobilenet_v2 (TF1.x) from this repository and converted both the frozen graph (as indicated here) and the SavedModel, and both show the same behavior: the SavedModel is much slower than the frozen graph. So I would like to know how to export the frozen inference graph.
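TF2 removed the TF1 `freeze_graph` tool, but a comparable frozen `GraphDef` can still be produced by folding a model's variables into constants with `convert_variables_to_constants_v2`. A minimal sketch, using a trivial Keras model as a stand-in for the trained detector (the model, shapes, and file names here are hypothetical, not from the issue):

```python
import tensorflow as tf
from tensorflow.python.framework.convert_to_constants import (
    convert_variables_to_constants_v2,
)

# Trivial stand-in for the trained model (hypothetical architecture).
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(8,)),
    tf.keras.layers.Dense(4),
])

# Wrap the model call in a concrete function with a fixed input signature.
full_model = tf.function(lambda x: model(x))
concrete_func = full_model.get_concrete_function(
    tf.TensorSpec([1, 8], tf.float32))

# Freeze: fold the model's variables into the graph as constants.
frozen_func = convert_variables_to_constants_v2(concrete_func)
graph_def = frozen_func.graph.as_graph_def()

# Serialize the frozen GraphDef to disk (output path is arbitrary).
tf.io.write_graph(graph_def, ".", "frozen_graph.pb", as_text=False)
```

For a SavedModel exported by the Object Detection API, the same pattern applies after `tf.saved_model.load(...)`, taking the concrete function from `model.signatures["serving_default"]` instead of wrapping a Keras model by hand.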

lucky-xu-1994 commented 3 years ago

> I have trained a network so I can use it in TensorFlow.js, but it runs very slowly because it was converted from a SavedModel instead of a frozen graph, as was done for the pretrained models provided here.
>
> To check that the problem was not on my side, I took ssdlite_mobilenet_v2 (TF1.x) from this repository and converted both the frozen graph (as indicated here) and the SavedModel, and both show the same behavior: the SavedModel is much slower than the frozen graph. So I would like to know how to export the frozen inference graph.

Hi, have you solved this problem? If so, how? Thanks!