githubharald / SimpleHTR

Handwritten Text Recognition (HTR) system implemented with TensorFlow.
https://towardsdatascience.com/2326a3487cd5
MIT License

Using model checkpoint for inference. #49

Closed: ritzyag closed this issue 5 years ago

ritzyag commented 5 years ago

I have trained a model on my dataset and it performs decently on the validation set: I get an accuracy of 72% using --wordbeamsearch. Now I want to load the model for inference on a device, which can be done by converting the model to the .pb format. To convert the saved checkpoint to a frozen model I need the name of the output node, but I am not sure which output node name to use. Is there a node that directly gives the recognized text (with word beam search decoding) for a given input image, or does some other output node name have to be used?

Thank You :)
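For anyone stuck on the same question: a generic way to find candidate output node names in a TF1-style checkpoint is to restore the meta graph and print the operation names. This is a minimal sketch, not code from this repo; the checkpoint directory is an assumption, so point it at wherever your trained model is saved.

```python
import tensorflow as tf  # TensorFlow 1.x, as used by SimpleHTR

# Assumed checkpoint location; adjust to your own model directory.
checkpoint = tf.train.latest_checkpoint('model/')

with tf.Session() as sess:
    # Rebuild the graph from the .meta file written next to the checkpoint,
    # then restore the trained weights into it.
    saver = tf.train.import_meta_graph(checkpoint + '.meta')
    saver.restore(sess, checkpoint)

    # Print every operation name; the decoder output (the node that yields the
    # recognized character sequence) is typically among the last entries.
    for op in sess.graph.get_operations():
        print(op.name)
```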

githubharald commented 5 years ago

can't provide any support on this.

contactvpatel commented 5 years ago

@githubharald Could you please tell us the name of the output node, or a way to figure out the output node name? It is a required parameter for freezing a model so it can be used in production.
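Once a suitable output node name has been identified (for example by listing the graph's operations as sketched above), the checkpoint can be frozen into a .pb file with the standard TF1 graph_util utility. This is only a sketch: the checkpoint path, output file name, and output node name below are placeholders, not SimpleHTR's actual names. Also note that if the graph uses the word beam search custom op, that op library has to be loaded before the frozen graph can be imported for inference.

```python
import tensorflow as tf                      # TensorFlow 1.x
from tensorflow.python.framework import graph_util

# Placeholder values: set the checkpoint directory and the output node name
# you found when listing the graph's operations.
checkpoint = tf.train.latest_checkpoint('model/')
output_node_names = ['decoded_output']       # hypothetical name, replace with the real one

with tf.Session() as sess:
    # Restore graph structure and trained weights from the checkpoint.
    saver = tf.train.import_meta_graph(checkpoint + '.meta')
    saver.restore(sess, checkpoint)

    # Replace all variables by constants so the graph becomes self-contained.
    frozen_graph_def = graph_util.convert_variables_to_constants(
        sess, sess.graph_def, output_node_names)

    # Serialize the frozen graph to a .pb file for deployment.
    with tf.gfile.GFile('frozen_model.pb', 'wb') as f:
        f.write(frozen_graph_def.SerializeToString())
```

Loading it back for inference is then a matter of parsing the .pb into a GraphDef, importing it with tf.import_graph_def, and fetching the output tensor by the same node name.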