malllabiisc / CompGCN

ICLR 2020: Composition-Based Multi-Relational Graph Convolutional Networks
Apache License 2.0

Random adjacency matrices do not affect the performance #32

Open zhanqiuzhang opened 3 years ago

zhanqiuzhang commented 3 years ago

Hi, thanks for sharing the code!

I found that after randomly corrupting the adjacency matrix, the performance of CompGCN remains unchanged (0.334 with DistMult as the score function and multiplication as the composition operator). The code in run.py that I changed is as follows.

import random  # added at the top of run.py

# Build the training edges, but replace each true object with a randomly
# sampled entity so the original adjacency structure is destroyed.
for sub, rel, obj in self.data['train']:
    obj = random.randint(0, self.p.num_ent - 1)  # randint is inclusive on both ends
    edge_index.append((sub, obj))
    edge_type.append(rel)

# Adding inverse edges (also with randomly sampled objects)
for sub, rel, obj in self.data['train']:
    obj = random.randint(0, self.p.num_ent - 1)
    edge_index.append((obj, sub))
    edge_type.append(rel + self.p.num_rel)

Am I misunderstanding the code?
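As a side note (not code from the repository), one quick sanity check for this kind of experiment is to count how many of the randomized edges still happen to coincide with a real (subject, object) pair. The toy sketch below is self-contained; `train_triples` and `num_ent` are placeholder names, not variables from run.py.

import random

# Toy (sub, rel, obj) triples and entity count, purely illustrative.
train_triples = [(0, 0, 1), (1, 1, 2), (2, 0, 3)]
num_ent = 4

# Replace every object with a random entity, as in the modified run.py above.
true_pairs = {(s, o) for s, _, o in train_triples}
edge_index = [(s, random.randint(0, num_ent - 1)) for s, _, _ in train_triples]

# Count how many corrupted edges still match a real pair by chance.
overlap = sum(pair in true_pairs for pair in edge_index)
print(f'{overlap} of {len(edge_index)} randomized edges still match a real pair')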

nguyenhungquang commented 3 years ago

I think the main contribution to the performance comes from the decoder. I tried removing the convolution layer but still obtained similar results.
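For concreteness, here is a minimal sketch of that kind of ablation (not the repository's code): score triples with a plain DistMult decoder over freely learned embeddings, with no graph convolution at all. The class name `DistMultOnly` and all sizes (`num_ent`, `num_rel`, `embed_dim`) are illustrative assumptions.

import torch
import torch.nn as nn


class DistMultOnly(nn.Module):
    """DistMult decoder with no graph-convolution encoder (illustrative only)."""

    def __init__(self, num_ent, num_rel, embed_dim):
        super().__init__()
        # Embeddings are learned directly; no message passing over edge_index.
        self.ent_embed = nn.Embedding(num_ent, embed_dim)
        self.rel_embed = nn.Embedding(num_rel, embed_dim)
        nn.init.xavier_uniform_(self.ent_embed.weight)
        nn.init.xavier_uniform_(self.rel_embed.weight)

    def forward(self, sub, rel):
        # DistMult: score(s, r, o) = <e_s * w_r, e_o>, computed against all entities.
        sub_emb = self.ent_embed(sub)                           # (batch, dim)
        rel_emb = self.rel_embed(rel)                           # (batch, dim)
        return (sub_emb * rel_emb) @ self.ent_embed.weight.t()  # (batch, num_ent)


# Toy usage with made-up sizes.
model = DistMultOnly(num_ent=100, num_rel=10, embed_dim=16)
scores = model(torch.tensor([0, 1]), torch.tensor([2, 3]))
print(scores.shape)  # torch.Size([2, 100])

If training such a decoder-only model reaches scores close to the full CompGCN numbers, that would support the claim that the decoder, rather than the graph convolution, accounts for most of the performance.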