Closed bhavikapanara closed 4 years ago
Hi Bhavika,
This is a common problem with neural language models - longer outputs have a lower overall likelihood, so the model favours shorter ones. There is a length_penalty parameter that tries to compensate for this - you could try evaluating the model with a higher value to encourage longer output, e.g. by adding --length_penalty 0.2 when you run evaluation.
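To illustrate why a length penalty helps, here is a minimal sketch of the common length-normalization heuristic used in beam search (dividing the summed log-probabilities by length raised to the penalty). The exact formula varies between libraries, so treat this as an approximation of the idea rather than the implementation used here:

```python
import math

def length_normalized_score(log_probs, length_penalty):
    """Score a candidate sequence: sum of token log-probs divided by
    len^length_penalty. A higher length_penalty boosts longer sequences
    (0.0 means no normalization). This mirrors the common beam-search
    heuristic; exact formulas differ by library."""
    return sum(log_probs) / (len(log_probs) ** length_penalty)

# Short candidate: 4 tokens, average log-prob -0.5 each (raw sum = -2.0)
short = [-0.5] * 4
# Longer candidate: 8 tokens, average log-prob -0.6 each (raw sum = -4.8,
# i.e. worse raw likelihood, as longer outputs usually are)
long = [-0.6] * 8

for lp in (0.0, 1.0, 2.0):
    s = length_normalized_score(short, lp)
    l = length_normalized_score(long, lp)
    winner = "short" if s > l else "long"
    print(f"length_penalty={lp}: short={s:.3f}, long={l:.3f} -> {winner}")
```

With no normalization the short candidate always wins on raw likelihood; as the penalty grows, the longer candidate can overtake it, which is why raising the value encourages longer generated questions.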
Thanks so much, @tomhosking, for your quick reply.
I will try this.
I got better results by adjusting the length_penalty parameter.
Thanks, @tomhosking for sharing the code.
By most criteria, your code is able to generate good-quality questions. However, in some cases it generates questions that truncate useful phrases.
For example:
Sentence: Narendra Damodardas Modi is an Indian politician serving as the 14th and current Prime Minister of India since 2014.
Selected Answer: Narendra Damodardas Modi
Generated Question: who is an indian politician serving ?

Sentence: As of 2017, Ahmedabad's estimated gross domestic product was $68 billion.
Selected Answer: 2017
Generated Question: in what year was ahmedabad 's estimated gross domestic product ?
Can you please help me solve this problem? Why is the model dropping phrases/words while generating questions?
Thanks Bhavika