I am trying to synthesize from a checkpoint after training to 37k steps. The following files are in the logs-tacotron folder:
model.ckpt-37000.meta
model.ckpt-37000.index
model.ckpt-37000.data-00000-of-00001
So I used the following command to synthesize:
python .\demo_server.py --checkpoint C:\Users\User\tacotron\logs-tacotron\model.ckpt-37000.index
and I got the error below:
NotFoundError (see above for traceback): Restoring from checkpoint failed. This is most likely due to a Variable name or other graph key that is missing from the checkpoint. Please ensure that you have not altered the graph expected based on the checkpoint. Original error:
Tensor name "model/inference/decoder/output_projection_wrapper/Location_Sensitive_Attention/attention_bias" not found in checkpoint files C:\Users\User\tacotron\logs-tacotron\model.ckpt-37000.index
[[Node: save/RestoreV2 = RestoreV2[dtypes=[DT_FLOAT, DT_FLOAT, DT_FLOAT, DT_FLOAT, DT_FLOAT, ..., DT_FLOAT, DT_FLOAT, DT_FLOAT, DT_FLOAT, DT_FLOAT], _device="/job:localhost/replica:0/task:0/device:CPU:0"](_arg_save/Const_0_0, save/RestoreV2/tensor_names, save/RestoreV2/shape_and_slices)]]
[[Node: save/RestoreV2/_79 = _Recv[client_terminated=false, recv_device="/job:localhost/replica:0/task:0/device:GPU:0", send_device="/job:localhost/replica:0/task:0/device:CPU:0", send_device_incarnation=1, tensor_name="edge_84_save/RestoreV2", tensor_type=DT_FLOAT, _device="/job:localhost/replica:0/task:0/device:GPU:0"]()]]
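One thing I noticed while debugging: the command above passes the `.index` file itself, but TensorFlow restore calls normally expect the checkpoint *prefix* (the path without the `.index`/`.meta`/`.data-*` suffix), i.e. `model.ckpt-37000`. A minimal sketch of stripping that suffix (the helper name `checkpoint_prefix` is mine, not from the repo):

```python
import os

def checkpoint_prefix(path):
    # TensorFlow savers expect the checkpoint prefix, i.e. the path
    # without the .index/.meta/.data-NNNNN-of-NNNNN suffix.
    base, ext = os.path.splitext(path)
    if ext in (".index", ".meta") or ext.startswith(".data"):
        return base
    return path

# e.g. "model.ckpt-37000.index" -> "model.ckpt-37000"
```

That said, the "Tensor name ... not found in checkpoint" message can also mean the graph being built does not match the one that was trained, so the suffix may not be the only issue.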