Closed — indrajithi closed this issue 6 years ago
Well, the main reason is that this is a toy model used to accompany an introductory presentation, not a production question generation model. However, I'm curious: if you apply the fix suggested in https://github.com/Maluuba/qgen-workshop/issues/3#issuecomment-383826450, do the questions get any better?
I ran it for two epochs with that typo fixed and got
```
Question: what country is the most ?
Answer: international
Question: what is the most ?
Answer: oil supply
Question: what is the name ?
Answer: terror
```
which seems more coherent than before.
After training the model, the generated questions are very bizarre. The model was trained and the (test|train|dev).csv files were generated as per the instructions. Can you tell me what might be the reason for this?