Thanks for your interest. Generally, it takes around 80 epochs to achieve its best performance on the dev set under the default setting.
Then an epoch taking around 40 minutes to train is normal, right?
Yes, since language models are time-consuming compared with traditional KGE models like TransE. You can also try using bert-tiny as the base model, which runs much faster and achieves comparable performance on FB15K-237.
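For reference, a bert-tiny run might look like the command below. It reuses the flags from the command later in this thread and only swaps the --plm value; the exact identifier accepted for bert-tiny is an assumption, so please check the model options supported by main.py:

python main.py --batch_size 16 --plm bert-tiny --contrastive --self_adversarial --data wn18rr --task LP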
OK, I will try. Thank you for your reply. Good luck.
I ran the program on a V100 using the script below, but it takes 42 minutes to run one epoch. May I ask how long one epoch took for you?
python main.py --batch_size 16 --plm bert --contrastive --self_adversarial --data wn18rr --task LP