tensorflow / nmt

TensorFlow Neural Machine Translation Tutorial

Error in Changing Inference Batch Size #377

Open · raghavgurbaxani opened this issue 5 years ago

raghavgurbaxani commented 5 years ago

Hi,

When I try to change the inference batch size by setting --override_loaded_hparams=True and --infer_batch_size=128:

python -m nmt.nmt --out_dir=/tmp/nmt_model --inference_input_file=/tmp/my_infer_file.vi --inference_output_file=/tmp/nmt_model/output_infer --infer_batch_size=128 --override_loaded_hparams=True

I get the following error:

"a mismatch between the current graph and the graph") tensorflow.python.framework.errors_impl.InvalidArgumentError: Restoring from checkpoint failed. This is most likely due to a mismatch between the current graph and the graph from the checkpoint. Please ensure that you have not altered the graph expected based on the checkpoint. Original error:

Assign requires shapes of both tensors to match. lhs shape= [7709,32] rhs shape= [7709,128] [[{{node save/Assign_11}} = Assign[T=DT_FLOAT, _class=["loc:@embeddings/encoder/embedding_encoder"], use_locking=true, validate_shape=true, _device="/job:localhost/replica:0/task:0/device:GPU:0"](embeddings/encoder/embedding_encoder, save/RestoreV2/_21)]]
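If I read the error correctly, the checkpoint stores embeddings/encoder/embedding_encoder with shape [7709, 128], while the graph rebuilt for inference declares it with shape [7709, 32], so the Saver refuses to restore it. A toy TF 1.x snippet (not nmt code, just a sketch of the same failure mode) reproduces the error:

import tensorflow as tf

# Save a checkpoint whose variable has shape [7709, 128].
with tf.Graph().as_default():
    tf.get_variable("embedding_encoder", shape=[7709, 128])
    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        tf.train.Saver().save(sess, "/tmp/toy_ckpt")

# Rebuild the "same" variable with shape [7709, 32] and try to restore into it.
with tf.Graph().as_default():
    tf.get_variable("embedding_encoder", shape=[7709, 32])
    with tf.Session() as sess:
        # Fails with: Assign requires shapes of both tensors to match.
        tf.train.Saver().restore(sess, "/tmp/toy_ckpt")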

Could you please advise on what I'm doing wrong?
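In case it helps, I can list the variable shapes actually stored in the checkpoint under /tmp/nmt_model with this small snippet (plain TF 1.x, not part of nmt):

import tensorflow as tf

# Print every variable saved in the latest checkpoint of the model directory,
# together with its shape (e.g. embeddings/encoder/embedding_encoder).
ckpt = tf.train.latest_checkpoint("/tmp/nmt_model")
for name, shape in tf.train.list_variables(ckpt):
    print(name, shape)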