V-Enzo opened this issue 5 years ago
Reconstruction loss (labelled "autoenc_loss" in the code) and similarity loss (labelled "cycle_loss") are defined in summarization.py.
From my understanding, 'enc' and 'rec' are two versions of the similarity loss (also defined in summarization.py): the 'enc' version computes the similarity loss from the encoded summaries (what comes out of the encoder), while the 'rec' version uses the embeddings of the decoded texts (what comes out of the decoder). The final model described in the paper uses the 'enc' version, which is what the code appears to do by default via self.cycle_loss='enc' in project_settings.py.
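To make that concrete, here is a rough sketch of how I understand the two variants. The function, tensor shapes, and names (doc_embs, encoder, decoder) are my own illustration, not the repo's actual code:

```python
import torch
import torch.nn.functional as F

def similarity_loss(doc_embs, summary, encoder, decoder, variant='enc'):
    """Hypothetical sketch of the 'enc' vs 'rec' similarity-loss variants.

    doc_embs: encodings of the original input documents, shape (N, D)
    summary:  the generated summary
    variant:  'enc' compares documents to the encoded summary,
              'rec' compares them to the embedding of the decoded (reconstructed) text.
    """
    if variant == 'enc':
        # 'enc': re-encode the summary and compare it to each document encoding
        target = encoder(summary)            # shape (1, D)
    else:
        # 'rec': decode the summary back to text, embed that, and compare
        reconstructed = decoder(summary)
        target = encoder(reconstructed)      # shape (1, D)

    # average cosine distance between the summary representation and each document
    sims = F.cosine_similarity(doc_embs, target, dim=-1)
    return (1 - sims).mean()
```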
@sosuperic please let me know if I am mistaken.
@stepgazaille Thanks for your explanation!
Hi @sosuperic, according to the paper, when we backpropagate through the model there should be an autoencoder reconstruction loss and an average summary similarity loss, but I couldn't find the code computing autoenc_loss in train_sum.py (I only see the backward pass, and the autoencoder file path is None in project_settings.py). Could you please tell me where they are? Thank you.
Another problem: I think cycle_loss should be the average similarity loss, but I found the cycle_loss parameter in project_settings.py, and now I am confused about the meaning of cycle loss. Does it mean we can only choose one of those two losses to backpropagate, or what is the actual meaning of the parameter? Looking forward to your reply! Thank you again.
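For reference, this is roughly the training step I expected from the paper, with both losses combined before a single backward pass. The names and weights here are my guesses, not the actual code in train_sum.py:

```python
import torch

def training_step(model, batch, optimizer, autoenc_weight=1.0, cycle_weight=1.0):
    # Hypothetical names: I expected something like these two terms per step.
    autoenc_loss = model.reconstruction_loss(batch)  # autoencoder reconstruction loss
    cycle_loss = model.similarity_loss(batch)        # average summary similarity loss

    # Combine both objectives and backpropagate once.
    loss = autoenc_weight * autoenc_loss + cycle_weight * cycle_loss

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

If only one of autoenc_loss or cycle_loss is actually used in the backward pass, that is exactly the part I would like to understand.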