zrg1993 closed this issue 3 years ago
@zrg1993 We implement the model following the original authors in knowledge_graph_attention_network, and the A_in matrix is updated every epoch, as in KGATTrainer.
```python
"""
*********************************************************
Alternative Training for KGAT:
... phase 2: to train the KGE method & update the attentive Laplacian matrix.
"""
if args.model_type in ['kgat']:
    n_A_batch = len(data_generator.all_h_list) // args.batch_size_kg + 1
    if args.use_kge is True:
        # using KGE method (knowledge graph embedding).
        for idx in range(n_A_batch):
            btime = time()
            A_batch_data = data_generator.generate_train_A_batch()
            feed_dict = data_generator.generate_train_A_feed_dict(model, A_batch_data)
            _, batch_loss, batch_kge_loss, batch_reg_loss = model.train_A(sess, feed_dict=feed_dict)
            loss += batch_loss
            kge_loss += batch_kge_loss
            reg_loss += batch_reg_loss
    if args.use_att is True:
        # updating attentive laplacian matrix.
        model.update_attentive_A(sess)
```
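To make the `update_attentive_A` step concrete, here is a hedged sketch (not the repo's actual code) of what the epoch-end update conceptually does: given a raw attention score for every (h, r, t) triple, softmax-normalize the scores over each head's outgoing edges and pack the result into a sparse attentive Laplacian. The function name and arguments are illustrative.

```python
import numpy as np
from scipy.sparse import coo_matrix

def attentive_laplacian(heads, tails, scores, n_nodes):
    """Row-wise softmax of raw attention scores -> sparse attentive A_in.

    heads, tails: int arrays giving the (h, t) endpoints of each triple
    scores: raw attention scores pi(h, r, t), one per triple
    """
    exp_scores = np.exp(scores - scores.max())   # shift for numerical stability
    norm = np.zeros(n_nodes)
    np.add.at(norm, heads, exp_scores)           # sum of exp-scores per head node
    values = exp_scores / norm[heads]            # softmax over each head's edges
    return coo_matrix((values, (heads, tails)), shape=(n_nodes, n_nodes))

# toy example: node 0 has two equally-scored edges, node 1 has one
A = attentive_laplacian(np.array([0, 0, 1]), np.array([1, 0, 0]),
                        np.array([1.0, 1.0, 0.5]), 2)
```

Because of the row-wise softmax, each node's outgoing attention weights sum to 1, which is what makes the matrix usable as a normalized (attentive) Laplacian for message passing.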
I also found a problem in my previous reply in #1001: A_in is the attentive Laplacian matrix. For model details, you can refer to the author's source code in this repo. If you still have questions about the attention mechanism of the model, feel free to ask them here.
@Sherry-XLL Thank you very much. I will check the implementation from the KGAT authors.
https://github.com/RUCAIBox/RecBole/blob/b7b29954fbc963a406ddbe2f917b4ba56bd7a22b/recbole/model/knowledge_aware_recommender/kgat.py#L156
I read the paper and the code in RecBole. Now I have several questions to confirm; I hope you can help me make them clear.

1. Is A_in only calculated when the model initializes, and static for each aggregation layer? The reason I ask is that in the GAT model, each layer has a different learnable weight matrix for calculating the attention scores. However, in the KGAT paper there seems to be only one A_in matrix and no learnable weights for calculating A_in.
2. When is the A_in matrix updated? In the RecBole implementation, the model updates A_in when one epoch finishes. Is my understanding right?
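The schedule being asked about can be sketched as follows, with hypothetical names: within one forward pass, every aggregation layer reuses the same A_in (unlike GAT, which learns per-layer attention weights), and A_in itself is only recomputed once at the end of each epoch. This is a minimal illustration, not RecBole's actual code.

```python
import numpy as np

def forward(A_in, x, layer_weights):
    """GCN-style aggregation: each layer has its own weights W,
    but every layer shares the SAME attentive matrix A_in."""
    h = x
    for W in layer_weights:          # per-layer learnable weights
        h = np.tanh(A_in @ h @ W)    # single shared A_in across layers
    return h

rng = np.random.default_rng(0)
A_in = np.eye(4)                                     # stand-in attentive Laplacian
x = rng.normal(size=(4, 8))                          # node embeddings
weights = [rng.normal(size=(8, 8)) for _ in range(3)]

for epoch in range(2):
    out = forward(A_in, x, weights)                  # A_in is fixed within the epoch
    # ... train the CF / KGE losses here ...
    A_in = np.eye(4) * 0.9 + 0.1 / 4                 # epoch end: recompute A_in
```

So there is no per-layer attention parameter: the attention enters only through the shared A_in, which is refreshed from the current embeddings once per epoch.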