JasonYCHuang opened this issue 6 years ago
Same issue here.
I did a dirty fix by adding:

```python
hparams.num_translations_per_input = flags.num_translations_per_input
hparams.beam_width = flags.beam_width
```

in nmt.py, in run_main(), just after hparams is loaded, and it works just fine.
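For context, here is a minimal runnable sketch of the idea, using `types.SimpleNamespace` to stand in for the real `hparams` and `flags` objects in nmt.py (the attribute names match the fix above; everything else is illustrative):

```python
from types import SimpleNamespace

# Stand-ins for the hparams restored from the training run and the
# command-line flags passed at inference time.
hparams = SimpleNamespace(beam_width=0, num_translations_per_input=1)
flags = SimpleNamespace(beam_width=10, num_translations_per_input=5)

# The "dirty fix": copy the inference flags onto the loaded hparams
# right after they are restored in run_main().
hparams.num_translations_per_input = flags.num_translations_per_input
hparams.beam_width = flags.beam_width

print(hparams.beam_width)                  # 10
print(hparams.num_translations_per_input)  # 5
```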
Thanks @ttrouill! It also works on my side.
Looks like inference uses the hparams saved from training, and they are not updated with the inference-time parameters:
https://github.com/tensorflow/nmt/blob/master/nmt/nmt.py#L577
Are you going to create a PR? I can create one if you are not available. Let me know your preference.
Go ahead :) Though it should probably be something cleaner and more general (to pass other unpassed parameters too) than what I proposed.
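A cleaner, more general version might loop over a list of inference-related flag names instead of hard-coding the two assignments. A sketch under assumptions: the flag names in the list and the `SimpleNamespace` stand-ins are illustrative, not the exact set nmt.py defines:

```python
from types import SimpleNamespace

# Flags whose inference-time values should override the trained hparams.
# This list is illustrative; the real nmt.py defines more flags.
INFERENCE_OVERRIDES = ["beam_width", "num_translations_per_input",
                       "infer_batch_size"]

def override_inference_hparams(hparams, flags, names=INFERENCE_OVERRIDES):
    """Copy selected flag values onto hparams, skipping unset (None) flags."""
    for name in names:
        value = getattr(flags, name, None)
        if value is not None:
            setattr(hparams, name, value)
    return hparams

hparams = SimpleNamespace(beam_width=0, num_translations_per_input=1,
                          infer_batch_size=32)
flags = SimpleNamespace(beam_width=10, num_translations_per_input=5,
                        infer_batch_size=None)  # unset flag: keep hparams value

override_inference_hparams(hparams, flags)
print(hparams.beam_width, hparams.num_translations_per_input,
      hparams.infer_batch_size)  # 10 5 32
```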
@ttrouill @JasonYCHuang are you still planning to make the PR?
If I provide one sentence, is it possible to infer the top-k predictions, not only the best one? It looks like this project provides that function, but I didn't succeed with it and always get only the best prediction.

I put only one sentence in /tmp/inference/one.rct and set --num_translations_per_input=5 in inference mode, but I get only one prediction, not 5 candidates. I use tensorflow-nightly on the master branch, commit 365e73. Could you give me some directions?
training script
inference script
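One plausible explanation, tying back to the hparams issue above (a sketch of the interaction, not a confirmed diagnosis): beam search can only return as many hypotheses as there are beams, so the number of emitted translations is effectively capped by beam_width. If inference still sees a training-time beam_width (e.g. 0, i.e. greedy decoding), only the single best prediction comes out regardless of --num_translations_per_input. The hypothetical helper below illustrates that capping rule:

```python
def effective_num_translations(num_translations_per_input, beam_width):
    # The requested count is capped by the number of beams and is at least 1
    # (greedy decoding still yields one translation).
    return max(min(num_translations_per_input, beam_width), 1)

# With beam_width correctly passed through, 5 candidates can be emitted.
print(effective_num_translations(5, 10))  # 5

# If hparams keep a training-time beam_width of 0 (greedy decoding),
# only the single best prediction is emitted.
print(effective_num_translations(5, 0))   # 1
```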