Cartus / AGGCN

Attention Guided Graph Convolutional Networks for Relation Extraction (authors' PyTorch implementation for the ACL19 paper)
MIT License

Question about AGGCN model details #35

Closed. Youoo1 closed this issue 2 years ago.

Youoo1 commented 3 years ago

Hello! I've been reading through the model code for the AGGCN part and there is one point I don't understand. The ModuleList defined in the GCN layer contains four sublayers: two GraphConvLayer and two MultiGraphConvLayer. Why do only the latter two sublayers use the attention mechanism? My understanding was that attention should be applied to every sentence to generate a fully connected weighted graph, is that not the case?

Cartus commented 2 years ago

Hi, we explain the motivation for this design at the end of Section 2.2 (Attention Guided Layer) of our paper:

In practice, we treat the original adjacency matrix as an initialization so that the dependency information can be captured in the node representations for later attention calculation. The attention guided layer is included starting from the second block.
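For readers landing on this issue with the same question, here is a minimal PyTorch sketch of the idea described above. It is not the repository's actual code: the class names (`GCNSketchBlock`, `AGGCNSketch`) and the single-head `attention_adj` helper are invented for illustration. The only point it demonstrates is that the first block propagates over the original dependency adjacency matrix, while the later blocks recompute a fully connected weighted adjacency from the (now dependency-aware) node representations.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class GCNSketchBlock(nn.Module):
    """One GCN sublayer that propagates over whatever adjacency it is given."""

    def __init__(self, dim):
        super().__init__()
        self.linear = nn.Linear(dim, dim)

    def forward(self, adj, x):
        # adj: (batch, n, n), x: (batch, n, dim)
        denom = adj.sum(dim=2, keepdim=True) + 1   # degree normalization (+1 for self-loop)
        ax = adj.bmm(x) + x                        # aggregate neighbors plus self
        return F.relu(self.linear(ax / denom))


def attention_adj(x):
    """Self-attention scores over node representations, used as a fully
    connected weighted adjacency matrix (single head for brevity)."""
    d = x.size(-1)
    scores = x.bmm(x.transpose(1, 2)) / d ** 0.5   # (batch, n, n)
    return F.softmax(scores, dim=-1)


class AGGCNSketch(nn.Module):
    def __init__(self, dim, num_blocks=2):
        super().__init__()
        # Mirrors the ModuleList in the question: the first block is a plain
        # GraphConvLayer-style sublayer, the later ones are attention guided.
        self.first_block = GCNSketchBlock(dim)
        self.attn_blocks = nn.ModuleList(
            GCNSketchBlock(dim) for _ in range(num_blocks - 1)
        )

    def forward(self, dep_adj, x):
        # Block 1: the original dependency adjacency matrix serves as the
        # "initialization", injecting dependency structure into the nodes.
        x = self.first_block(dep_adj, x)
        # Blocks 2..N: the adjacency is recomputed from the node
        # representations, giving a fully connected weighted graph.
        for block in self.attn_blocks:
            x = block(attention_adj(x), x)
        return x


if __name__ == "__main__":
    batch, n, dim = 2, 5, 16
    model = AGGCNSketch(dim)
    dep_adj = torch.eye(n).expand(batch, n, n)     # placeholder dependency graph
    out = model(dep_adj, torch.randn(batch, n, dim))
    print(out.shape)                               # torch.Size([2, 5, 16])
```

So the first block never needs attention at all: it only exists to push dependency information into the node vectors, which the attention guided blocks then use to score every node pair.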