I have exported the model as a graphdef protobuf that can be evaluated with a new module validate_on_lfw_new.py.
It should just be a matter of downloading the model and pointing to it when running evaluation, something like:
python validate_on_lfw_new.py ~/datasets/lfw/lfw_mtcnnpy_160 ~/models/export/20161030-023650.pb
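For reference, a minimal sketch of how such a frozen graphdef can be loaded and queried for embeddings outside the evaluation script. The tensor names (input:0, phase_train:0, embeddings:0) are assumptions based on the usual facenet naming and may differ in the actual export:

```python
# Sketch: load the exported frozen graphdef and run a forward pass (TF 1.x style).
# Tensor names below are assumptions, not confirmed from the export itself.
import numpy as np
import tensorflow as tf

with tf.gfile.GFile('20161030-023650.pb', 'rb') as f:
    graph_def = tf.GraphDef()
    graph_def.ParseFromString(f.read())

with tf.Graph().as_default() as graph:
    # Import the frozen graph; all weights come in as constants.
    tf.import_graph_def(graph_def, name='')

with tf.Session(graph=graph) as sess:
    images = graph.get_tensor_by_name('input:0')
    phase_train = graph.get_tensor_by_name('phase_train:0')
    embeddings = graph.get_tensor_by_name('embeddings:0')
    # Dummy batch of 160x160 RGB crops, just to show the feed structure.
    batch = np.zeros((1, 160, 160, 3), dtype=np.float32)
    emb = sess.run(embeddings, feed_dict={images: batch, phase_train: False})
    print(emb.shape)
```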
That seems to work for evaluation.
However, if the weights in the .pb file come from a frozen graph, can I still import it with import_graph_def and re-train the model?
If you want to retrain the model, another alternative is to create the model programmatically, i.e. by calling inference(...) for the model, like it's done in facenet_train_classifier.py.
And then you can load the file containing the parameters, i.e. model-20161030-023650.ckpt-80000
using saver.restore(...) in the same way as it's done in load_model(...).
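Roughly, a sketch of that approach. The exact inference(...) signature and argument names are assumptions here; the real call in facenet_train_classifier.py takes additional arguments (e.g. weight decay, bottleneck size):

```python
# Sketch: rebuild the network programmatically and restore the downloaded
# checkpoint so the variables exist and can be trained further (TF 1.x style).
# Argument names and placeholder shapes are assumptions, not the exact script code.
import tensorflow as tf
from models import inception_resnet_v1 as network

images_placeholder = tf.placeholder(tf.float32, shape=(None, 160, 160, 3), name='input')
phase_train_placeholder = tf.placeholder(tf.bool, name='phase_train')

# Build the model instead of importing the frozen graphdef.
prelogits, _ = network.inference(images_placeholder, keep_probability=0.8,
                                 phase_train=phase_train_placeholder)
embeddings = tf.nn.l2_normalize(prelogits, 1, 1e-10, name='embeddings')

saver = tf.train.Saver(tf.trainable_variables())
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # Restore the pre-trained parameters, then continue training as usual.
    saver.restore(sess, 'models/export/model-20161030-023650.ckpt-80000')
```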
OK, I get it now. Thanks for your explanation!
Which TensorFlow version was the model trained with to create the graph file, then?
Hi,
After downloading the pre-trained model "20161030-023650", I want to restore it with load_model() in "facenet.py". Unfortunately, a KeyError occurs:
In [15]: facenet.load_model('20161030-023650/', meta_file, ckpt_file)
KeyError Traceback (most recent call last)