fishmoon1234 / DAG-GNN


KL loss at line 340 of train.py #7

Open nanguoshun opened 4 years ago

nanguoshun commented 4 years ago

Hi @fishmoon1234 . Thanks for your code.

I would like to confirm with you about the KL loss defined at line 340 of train.py.

I am confused about why you compute the KL loss with `kl_gaussian_sem` when MLPs are used as the encoder and decoder, given that MLP and SEM are two separate options for the encoder and decoder.

Further, it seems that the implementation of `kl_gaussian_sem` here doesn't align with Equation (9); it simply multiplies the mean vector with itself, `mu * mu`.
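To make the question concrete, here is a minimal sketch (not the code from this repository; the function names are illustrative) of the closed-form Gaussian KL against a unit-normal prior that I understand Equation (9) to describe, and of how it collapses to `0.5 * sum(mu^2)` when the posterior variance is fixed to 1, which is what a `mu * mu` implementation computes:

```python
import torch

def kl_gaussian_full(mu, logvar):
    # Closed-form KL( N(mu, diag(exp(logvar))) || N(0, I) ), averaged over the batch.
    # This is the usual VAE KL term for a diagonal-Gaussian posterior.
    kl = 0.5 * (mu.pow(2) + logvar.exp() - logvar - 1.0)
    return kl.sum() / mu.size(0)

def kl_gaussian_unit_variance(mu):
    # With the posterior variance fixed to 1 (logvar = 0), the variance terms
    # cancel and only 0.5 * sum(mu^2) remains -- i.e. the "mu * mu" form.
    return 0.5 * (mu * mu).sum() / mu.size(0)

mu = torch.randn(8, 5)           # e.g. a batch of 8 samples, latent dimension 5
logvar = torch.zeros_like(mu)    # unit variance
print(torch.allclose(kl_gaussian_full(mu, logvar), kl_gaussian_unit_variance(mu)))  # True
```

So my reading is that the `mu * mu` term corresponds to assuming a fixed unit variance for the posterior, rather than the full expression in Equation (9).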

Looking forward to your suggestions.

Thanks a lot!

HantaoShu commented 4 years ago

Same question.

fishmoon1234 commented 4 years ago

Thank you for asking. A SEM encoder was implemented to test the algorithm, but it has not been updated, since we only use the MLP encoder/decoder. Therefore, it is quite possible that the SEM encoder (an old version) no longer works. I will update the SEM encoder in this code once I have time, if you are interested in that.

ItsyPetkov commented 3 years ago

Did anybody find a solution to this problem?