LARS-research / RED-GNN

Knowledge Graph Reasoning with Relational Digraph. WebConf 2022
59 stars 12 forks

Training time for FB15k-237 #10

Closed Jason-Mason closed 5 months ago

Jason-Mason commented 10 months ago

Hello, thank you for this work. I am trying to train the model on the FB15k-237 dataset with the default hyperparameter settings on a 2080Ti GPU. However, training is estimated to take approximately 5 hours per epoch. Could you share the training time from your experiments and confirm whether my situation is reasonable?

yzhangee commented 10 months ago

Yes, it generally takes several hours. FB15k-237 is denser than the other datasets, which results in much larger subgraphs. Considering that the 2080Ti is a bit old, 5 hours per epoch is reasonable.

If you have a larger GPU, such as an A100, you can increase the training and testing batch sizes for faster running.
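As a rough sketch of the suggestion above, a launch command with larger batch sizes might look like the following. Note that the flag names (`--n_batch`, `--n_tbatch`) and values here are assumptions for illustration, not verified against the RED-GNN training script; check the script's argument parser for the actual names.

```shell
# Hypothetical invocation: flag names and values are assumptions.
# On a larger GPU (e.g. A100), raise both the training batch size
# (--n_batch) and the testing batch size (--n_tbatch).
python train.py --data_path data/FB15k-237 --n_batch 50 --n_tbatch 100
```

Whether these values fit in memory depends on the GPU and on the subgraph sizes FB15k-237 produces, so it is worth increasing them gradually while watching GPU memory usage.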