apoorvumang / CronKGQA

ACL 2021: Question Answering over Temporal Knowledge Graphs
MIT License

pretrained model #12

Open xdcui-nlp opened 2 years ago

xdcui-nlp commented 2 years ago

Hello, I'd like to ask: why do I get exactly the same results with `distilbert-base-uncased` and `roberta-base`?

xdcui-nlp commented 2 years ago

Do I need to delete `temp.ckpt` before training?
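If the training script resumes from a saved checkpoint whenever one exists, a stale `temp.ckpt` left over from an earlier run could explain getting identical results regardless of which language model is specified. A minimal sketch of clearing it before a fresh run (the path and helper name here are assumptions for illustration, not the repo's actual code):

```python
import os

CKPT_PATH = "temp.ckpt"  # hypothetical path; check where the script actually saves

def maybe_clear_checkpoint(path: str = CKPT_PATH) -> bool:
    """Delete a stale checkpoint so training starts fresh with the new LM.

    Returns True if a checkpoint file was found and removed, False otherwise.
    """
    if os.path.exists(path):
        os.remove(path)
        return True
    return False
```

If the results still match after removing the checkpoint, the issue likely lies elsewhere (e.g., the LM choice not actually being passed through to the model).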

apoorvumang commented 2 years ago

Can you elaborate on the exact changes you made in the code?

xdcui-nlp commented 2 years ago
[Screenshot: 2021-12-24, 12:11 AM]

This is the code I changed.

apoorvumang commented 2 years ago

Which model did you use for these results, e.g. EmbedKGQA_complex, model1, etc.?

It would be helpful if you could share the exact line numbers you changed and the command you ran. Then I can check the same on my side.