rstriv / Know-Evolve

Implementation code for ICML '17 paper "Deep Temporal Reasoning for Dynamic Knowledge Graphs"

Relational embeddings initialization #8

Open JacquesEverwyn opened 5 years ago

JacquesEverwyn commented 5 years ago

Dear authors,

I'm trying to implement your framework in TensorFlow and have a few questions about how you deal with relational embeddings.

I understand that in the bilinear formulation there is one relationship weight matrix "R" per relation, trained during backpropagation. But how do the relational embeddings "r" work? You say in the paper that these embeddings are static: I take that to mean they don't evolve like the entity embeddings, but are they trained during backpropagation? Do you initialize them at zero like the entities?

In the code, you fetch the last existing embedding if one exists, or fall back to the embedding parameters if not.

latest_subject_rel_embed = GetEmbeddingParam("rel", cfg::num_rels, e->rel, inputs, param_dict["w_rel_init"], lookup_rel_onehot, lookup_rel_init);

It is not quite clear to me what this line does. I understand that you use the one-hot input and the parameters w_rel_init to fetch the embedding, but what exactly do you fetch? The row of w_rel_init that embeds the current relation, selected via the one-hot layer? And is w_rel_init the W_r mentioned in Section 4?
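If I read it correctly, in TensorFlow terms that line would be roughly the following (the names and sizes here are my own placeholders for my reimplementation, not taken from your code):

```python
import tensorflow as tf

num_rels = 24    # placeholder: number of relation types
embed_dim = 32   # placeholder: embedding dimension
rel_id = 3       # example relation index for the current event

# Trainable static relation parameters: one row per relation type
w_rel_init = tf.Variable(tf.random.normal([num_rels, embed_dim], stddev=0.01))

# My reading of the quoted line: a one-hot lookup that selects the row of
# w_rel_init corresponding to the current relation
rel_onehot = tf.one_hot(rel_id, depth=num_rels)                          # [num_rels]
rel_embed = tf.linalg.matvec(w_rel_init, rel_onehot, transpose_a=True)   # [embed_dim]
# equivalently: tf.nn.embedding_lookup(w_rel_init, rel_id)
```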

Thanks in advance for your answer.

rstriv commented 5 years ago

Hello,

Thank you for your query. Yes, w_rel_init is W_r, the parameter corresponding to the static relation embedding. The line you quoted gets the embedded relation corresponding to the one-hot relation input, using the current (updated) parameters. So yes, we train the relation embedding parameter w_rel_init via backprop; however, we do not store or update the relation embedding vectors themselves after every event, as we do for entities using the latest_embeddings vectors.
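As a rough sketch of that distinction in TensorFlow-style code (names and sizes below are illustrative and not taken from the repo):

```python
import tensorflow as tf

num_rels, num_entities, embed_dim = 24, 100, 32   # illustrative sizes

# Relation embeddings: a single trainable matrix (the analogue of w_rel_init / W_r).
# It changes only through the optimizer during backprop; its rows are never
# overwritten after individual events.
w_rel_init = tf.Variable(tf.random.normal([num_rels, embed_dim], stddev=0.01))

def relation_embedding(rel_id):
    # Static lookup: always the current row of w_rel_init for this relation.
    return tf.nn.embedding_lookup(w_rel_init, rel_id)

# Entity embeddings: initialized from a parameter, but the vector itself is
# recomputed and stored after every event the entity participates in.
w_ent_init = tf.Variable(tf.zeros([num_entities, embed_dim]))
latest_entity_embed = {}   # entity_id -> most recently computed embedding

def entity_embedding(ent_id):
    # Fall back to the initial parameter if the entity has no event history yet.
    if ent_id in latest_entity_embed:
        return latest_entity_embed[ent_id]
    return tf.nn.embedding_lookup(w_ent_init, ent_id)
```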