cmusphinx / g2p-seq2seq

G2P with Tensorflow

checkpoint - code mismatch #101

Closed georgesterpu closed 6 years ago

georgesterpu commented 6 years ago

Hi. I just cloned the repo a few minutes ago, ran the setup script, and downloaded the pre-trained model. Launching g2p-seq2seq crashes with the following message:

NotFoundError (see above for traceback): Key embedding_attention_seq2seq/embedding_attention_decoder/attention_decoder/Attention_0/bias not found in checkpoint [[Node: save_1/RestoreV2 = RestoreV2[dtypes=[DT_FLOAT, DT_INT32, DT_FLOAT, DT_FLOAT, DT_FLOAT, ..., DT_FLOAT, DT_FLOAT, DT_FLOAT, DT_FLOAT, DT_FLOAT], _device="/job:localhost/replica:0/task:0/device:CPU:0"](_arg_save_1/Const_0_0, save_1/RestoreV2/tensor_names, save_1/RestoreV2/shape_and_slices)]]

Has the code been updated since the pre-trained model was released?
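For what it's worth, a quick way to diagnose this kind of mismatch is to list the variable names actually stored in the downloaded checkpoint and compare them with the names in the error. This is only a minimal sketch; the checkpoint path below is a placeholder for wherever the pre-trained model was extracted.

```python
import tensorflow as tf

# Placeholder path: point this at the directory of the downloaded pre-trained model.
checkpoint_dir = "g2p-seq2seq-model"

# tf.train.list_variables reads the checkpoint index without building a graph
# and returns (name, shape) pairs for every saved tensor.
for name, shape in tf.train.list_variables(checkpoint_dir):
    print(name, shape)
```

If the `embedding_attention_seq2seq/.../Attention_0/bias` key from the traceback does not appear in that listing, the checkpoint was produced by a different version of the code than the one being run.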

nurtas-m commented 6 years ago

Hello! Are you running the g2p-seq2seq project from the master branch? What is your current TensorFlow version?
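(If in doubt, the installed version can be checked directly from Python; a trivial sketch:)

```python
import tensorflow as tf
print(tf.__version__)  # e.g. "1.6.0"
```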

In a few hours we will deploy the project from the t2t branch to the master branch.

georgesterpu commented 6 years ago

Hi @nurtas-m. Indeed, I've cloned the master branch, and I'm using TensorFlow 1.6. Was the published model trained using the code on the t2t branch?

nurtas-m commented 6 years ago

The project on the master branch is outdated and not compatible with TensorFlow >= 1.0. You need to either run the master branch with an old TensorFlow, or use the t2t branch with a new TensorFlow.

georgesterpu commented 6 years ago

I see, thanks. I'll wait for your update when it's ready. Thank you for making this useful project available.