aitormf opened 4 years ago
I have trained a network that I want to use in tensorflow.js, but inference is very slow because the model was converted from a SavedModel rather than from a frozen graph, as was done for the pretrained models provided here.

To check that the issue wasn't specific to my model, I took ssdlite_mobilenet_v2 (TF1.x) from this repository and converted both the frozen graph (as indicated here) and the SavedModel. The result is the same: the converted SavedModel is much slower than the converted frozen graph. So I would like to know how to export the frozen inference graph.
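For the TF1.x models in this repo, exporting is usually done with the `object_detection/export_inference_graph.py` script, which writes a `frozen_inference_graph.pb` alongside the SavedModel. The core of what that script does is fold the checkpoint's variables into constants. Here is a minimal, self-contained sketch of that freezing step using the v1 compat API on a tiny hypothetical graph (the graph and node names are illustrative, not from the repo):

```python
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

graph = tf.Graph()
with graph.as_default():
    # Tiny stand-in model: a placeholder fed through one variable.
    x = tf.placeholder(tf.float32, [None, 2], name="input")
    w = tf.Variable(tf.ones([2, 1]), name="w")
    y = tf.matmul(x, w, name="output")

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        # Replace Variable nodes with Const nodes holding their
        # current values -- this produces the "frozen" GraphDef.
        frozen = tf.graph_util.convert_variables_to_constants(
            sess, graph.as_graph_def(), ["output"])

# The frozen GraphDef contains no Variable ops.
assert all(n.op not in ("Variable", "VariableV2") for n in frozen.node)
```

The resulting `frozen` GraphDef can be serialized with `tf.io.write_graph(frozen, out_dir, "frozen_inference_graph.pb", as_text=False)` and then fed to the tensorflow.js converter as a frozen model.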
Hi, have you solved the problem? If yes, how? Thanks!