acbull / pyHGT

Code for "Heterogeneous Graph Transformer" (WWW'20), which is based on pytorch_geometric
MIT License

Question about attention mechanism #31

Open · smayru opened this issue 3 years ago

smayru commented 3 years ago
[Attached screenshot, 2021-01-18 17:45:37]

Thank you for providing the code. I have a question about Fig. 5 in your paper. In my understanding, an attention value is obtained for each triple (s, e, t) based on Eq. (3). In Fig. 5, however, you seem to report a single attention value per relation e, independent of s and t; for example, the attention for is_published_at is 0.970 in the right-hand tree of Fig. 5. Could you explain how you obtain the attention value for each relation?
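(For context, the heterogeneous mutual attention in the paper assigns one score per triple ⟨s, e, t⟩; up to notation, it is roughly:)

$$
\mathrm{Attention}_{HGT}(s,e,t)=\operatorname*{Softmax}_{\forall s\in N(t)}\Big(\big\Vert_{i\in[1,h]}\ \mathrm{ATT\text{-}head}^i(s,e,t)\Big),\qquad
\mathrm{ATT\text{-}head}^i(s,e,t)=\big(K^i(s)\,W^{ATT}_{\phi(e)}\,Q^i(t)^{\top}\big)\cdot\frac{\mu_{\langle\tau(s),\phi(e),\tau(t)\rangle}}{\sqrt{d}}
$$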

dujiaxin commented 3 years ago

I ran into the same question. But I think I figured it out by reading the code.

Besides referring to issue #27, I pasted my script here.

For example, if you want to visualize the attention weight on target type 1, the weight contributed by source type 2 would be: torch.matmul(self.att, self.att.T)[torch.logical_and(node_type_j == 1, node_type_i == 2)].sum() / torch.matmul(self.att, self.att.T)[node_type_j == 1].sum(). The self.att is saved in conv.py.

Maybe the author can double-check if I am right.
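(For reference, a minimal runnable sketch of that kind of per-type averaging, written against assumed shapes rather than the repo's actual variables: `att` stands for the per-edge attention saved as `self.att` in conv.py with shape `[num_edges, n_heads]`, and `src_type` / `tgt_type` are hypothetical per-edge node-type tensors.)

```python
import torch

def relation_weight(att, src_type, tgt_type, target=1, source=2):
    # Average the multi-head attention per edge: [num_edges, n_heads] -> [num_edges]
    att_per_edge = att.mean(dim=1)
    # Edges whose target node has type `target` ...
    into_target = tgt_type == target
    # ... and, among those, edges whose source node has type `source`
    from_source = into_target & (src_type == source)
    # Share of the attention mass flowing into `target`-type nodes
    # that comes from `source`-type nodes
    return att_per_edge[from_source].sum() / att_per_edge[into_target].sum()
```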

acbull commented 3 years ago

> I ran into the same question. But I think I figured it out by reading the code.
>
> Besides referring to issue #27, I pasted my script here.
>
> For example, if you want to visualize the attention weight on target type 1, the weight contributed by source type 2 would be: torch.matmul(self.att, self.att.T)[torch.logical_and(node_type_j == 1, node_type_i == 2)].sum() / torch.matmul(self.att, self.att.T)[node_type_j == 1].sum(). The self.att is saved in conv.py.
>
> Maybe the author can double-check if I am right.

Hi:

Yes, your answer is similar to how we get the average attention weight.

We don't directly use the Relation_weight to plot this tree; instead, we average the attention calculated for each <s,e,t> triple over the nodes within several batches, as this snippet does.
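(A rough sketch of that kind of batch averaging is below. This is not the snippet referenced above; the `batches` iterable and the per-edge attention/type/relation tensors are all assumed names for illustration.)

```python
from collections import defaultdict

# Assumed inputs per batch (hypothetical names): `att` is the per-edge attention saved
# by the conv layer (shape [num_edges, n_heads]); src_type / edge_type / tgt_type give
# the source node type, relation type, and target node type of each edge.
sums, counts = defaultdict(float), defaultdict(int)

for att, src_type, edge_type, tgt_type in batches:        # several sampled batches
    att_per_edge = att.mean(dim=1)                        # average over attention heads
    for a, s, e, t in zip(att_per_edge.tolist(), src_type.tolist(),
                          edge_type.tolist(), tgt_type.tolist()):
        sums[(s, e, t)] += a
        counts[(s, e, t)] += 1

# One averaged attention value per <source type, relation, target type>, which can then
# be normalized per target type to plot a tree like Fig. 5.
avg_att = {key: sums[key] / counts[key] for key in sums}
```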

smayru commented 3 years ago

Thank you for your prompt reply!