bdi-lab / InGram

InGram: Inductive Knowledge Graph Embedding via Relation Graphs (ICML 2023)

Dataset #6

Closed fantuan812 closed 7 months ago

fantuan812 commented 7 months ago

Hi, thanks for sharing your code. I would like to ask how msg.txt in the dataset is guaranteed to contain all entities and relations, and whether there is code for splitting out ℰinf.

jaejunlee714 commented 7 months ago

Hi, thanks for your interest in our paper! I'm sorry for the late reply. We've uploaded the code for splitting the data into train/msg/valid/test. We used a spanning tree of the inference graph to ensure that all entities are contained. For relations, we first randomly chose the labels of the edges of the spanning tree, and then randomly chose triplets that contain the missing relations.
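To make the strategy above concrete, here is a minimal sketch of a split that guarantees the message set covers every entity (via a spanning tree/forest built with union-find) and every relation (by moving one triple per missing relation). The function name `split_msg_inf`, the `msg_ratio` default, and the greedy-forest construction are illustrative assumptions, not the authors' actual split code:

```python
import random

def split_msg_inf(triples, msg_ratio=0.6, seed=0):
    """Split an inference graph's triples into a message set (msg) and an
    inference set (inf) so that msg covers every entity and every relation.
    Illustrative sketch of the strategy described above, not the authors'
    actual implementation."""
    rng = random.Random(seed)
    triples = list(triples)
    rng.shuffle(triples)

    # Union-find over entities, used to grow a spanning forest.
    parent = {}
    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    # 1) Keep a triple in msg whenever it joins two components; every
    #    entity's first triple is such an edge, so all entities are covered.
    msg, rest = [], []
    for h, r, t in triples:
        rh, rt = find(h), find(t)
        if rh != rt:
            parent[rh] = rt
            msg.append((h, r, t))
        else:
            rest.append((h, r, t))

    # 2) Cover relations missing from the forest: move one triple per
    #    missing relation from the remainder into msg.
    covered = {r for _, r, _ in msg}
    still_rest = []
    for h, r, t in rest:
        if r not in covered:
            covered.add(r)
            msg.append((h, r, t))
        else:
            still_rest.append((h, r, t))

    # 3) Top up msg to the desired ratio; everything left is inf.
    n_msg = int(len(triples) * msg_ratio)
    extra = still_rest[: max(0, n_msg - len(msg))]
    msg += extra
    inf = still_rest[len(extra):]
    return msg, inf
```

Because steps (1) and (2) run before the ratio top-up, every entity and relation of the inference graph appears in msg regardless of the shuffle order.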

Best, Jaejun Lee

fantuan812 commented 7 months ago

Thank you for your reply. Recently, I tried to apply your algorithm to non-inductive (transductive) link prediction, and I found that performance was very poor on the original FB15k-237 dataset. I would like to know which parameters can be adjusted for such a large dataset.

jaejunlee714 commented 7 months ago

I've never tried InGram on a transductive dataset. I think adjusting (1) the (hidden) dimensions of the entities and relations, (2) the margin, and (3) the learning rate would be helpful.
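For reference, here is a minimal sketch of where those three knobs typically enter a margin-based KG-embedding training setup. The specific values and the helper name `margin_ranking_loss` are illustrative assumptions, not InGram's defaults:

```python
# Illustrative hyperparameter values, not InGram's actual defaults:
dim = 32        # (1) hidden dimension of entity/relation embeddings
margin = 2.0    # (2) margin in the margin-based ranking loss
lr = 1e-3       # (3) learning rate passed to the optimizer

def margin_ranking_loss(pos_score, neg_score, margin):
    # Hinge loss: push positive-triple scores above negative-triple
    # scores by at least `margin`; zero loss once the gap is wide enough.
    return max(0.0, margin - pos_score + neg_score)

# A positive triple scoring 1.5 vs. a corrupted one scoring 0.5:
loss = margin_ranking_loss(1.5, 0.5, margin)  # 2.0 - 1.5 + 0.5 = 1.0
```

Larger datasets often tolerate (and benefit from) larger `dim`, while `margin` and `lr` usually need a small grid search.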

Best, Jaejun Lee.

fantuan812 commented 7 months ago

Thanks for your advice, I will try these. Good luck with your research!

jaejunlee714 commented 7 months ago

I'll close the issue. Wish you good luck with your research!