facebookresearch / mlqe

We release a dataset based on Wikipedia sentences and the corresponding translations in 6 different languages, along with scores (on a scale of 1 to 100) generated through human evaluations that represent the quality of the translations.

Paper: Unsupervised Quality Estimation for Neural Machine Translation
Creative Commons Attribution Share Alike 4.0 International

How to get the softmax distribution? #7

Open WJMacro opened 3 years ago

WJMacro commented 3 years ago

Hi Fomicheva, I'm trying to reproduce your results, but I ran into something tricky. In your paper you compute the softmax entropy at each decoding step. However, since the decoder uses beam search to find a better output sequence, we get beam_size probability distributions at each decoding step. How did you track the best sequence and extract its probability distribution during beam search? Could you please share your code?

mfomicheva commented 3 years ago

Hi WJMacro, we compute the entropy in a forced-decoding regime. That is, we re-score the already generated translations, so search is not an issue. I hope this helps.
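
For readers landing here later, here is a minimal sketch of the forced-decoding idea described above. It is not the repository's actual code: the names `model`, `src_ids`, and `hyp_ids` are hypothetical placeholders, and the call signature assumes a Hugging Face-style encoder-decoder that returns per-step logits under teacher forcing. The point is that conditioning the decoder on the already generated translation yields exactly one softmax distribution per target position, so no beam tracking is needed.

```python
import torch
import torch.nn.functional as F

def forced_decoding_entropy(model, src_ids, hyp_ids):
    """Average softmax entropy over the steps of a fixed hypothesis.

    src_ids: (1, src_len) source token ids
    hyp_ids: (1, tgt_len) token ids of the already generated translation
    """
    with torch.no_grad():
        # Teacher forcing: the decoder is conditioned on the hypothesis
        # itself, so there is one output distribution per target position.
        logits = model(input_ids=src_ids, decoder_input_ids=hyp_ids).logits
    log_probs = F.log_softmax(logits, dim=-1)        # (1, tgt_len, vocab)
    probs = log_probs.exp()
    # Entropy at each step: -sum_v p(v) * log p(v)
    step_entropy = -(probs * log_probs).sum(dim=-1)  # (1, tgt_len)
    return step_entropy.mean().item()
```

The same re-scoring trick works with any framework that lets you run the decoder on a given target prefix; you only need access to the logits at each position of the hypothesis.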