OanaMariaCamburu / e-SNLI

MIT License
158 stars 28 forks

Label Prepended #4

Closed JasonChan7 closed 4 years ago

JasonChan7 commented 4 years ago

Hi,

Thanks for releasing the code!

While reproducing the results, I ran into a few problems in the code:

Looking forward to your reply!

OanaMariaCamburu commented 4 years ago

Hi,

Cheers, Oana

JasonChan7 commented 4 years ago

OanaMariaCamburu commented 4 years ago

Hi,

  • In "teacher" mode (for training), I used a set of files that prepend the ground-truth label, the _label files here: https://github.com/OanaMariaCamburu/e-SNLI/tree/master/dataset/eSNLI. The forloop mode, on the other hand, appends the model's own prediction, so it looks like there is an inconsistency, but in practice there wasn't one. I agree this was not the best way to code it, but it was what I quickly put together at the time :) Sorry for the confusion! Also, I hope I didn't change it in the meantime: I remember I later wanted to clean it up, and hopefully I didn't stop halfway through.
  • The label is not meant to be predicted by the decoder (it could be, but that wasn't the goal in that experiment). Instead, it should be taken from the prediction of the MLP classifier and fed to the decoder, so that the decoder conditions the explanation on it. That is why the label predicted by the decoder is not fed any further.
  • I added an option to switch between using only the concatenation of the premise and hypothesis representations, or also appending their element-wise product and difference: https://github.com/OanaMariaCamburu/e-SNLI/blob/master/seq2seq/models_esnli_init.py#L92. Depending on that flag, one can test both variants. I recall there wasn't any significant difference, and the reported results should be from the version mentioned in the paper.
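The label-handling scheme in the first two points can be sketched in a few lines. This is a hypothetical illustration, not the repo's actual code: the function name `decoder_input` and the token handling are made up, while the label set is the standard e-SNLI one.

```python
# NLI labels as used in e-SNLI
LABELS = ["entailment", "neutral", "contradiction"]

def decoder_input(predicted_label_id, explanation_tokens):
    """Prepend the label chosen by the MLP classifier (not by the
    decoder itself) to the explanation token sequence, so the decoder
    generates an explanation conditioned on that label."""
    return [LABELS[predicted_label_id]] + explanation_tokens

# At training time ("teacher" mode) the ground-truth label is prepended;
# at test time the MLP classifier's argmax prediction is used instead.
print(decoder_input(2, ["the", "man", "is", "not", "sleeping"]))
# ['contradiction', 'the', 'man', 'is', 'not', 'sleeping']
```

The point of the design is that the label decision lives entirely in the classifier; the decoder only consumes it, which is why any label token the decoder happens to emit is ignored rather than fed back.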
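The feature flag in the last point can be illustrated with a minimal sketch. Plain Python lists stand in for the sentence embeddings, and `build_features` is a hypothetical name; the repo's implementation operates on tensors.

```python
def build_features(u, v, append_prod_diff=True):
    """Concatenate the premise and hypothesis embeddings u and v;
    when the flag is set, also append their element-wise product and
    absolute difference (the InferSent-style feature vector
    [u; v; u*v; |u-v|])."""
    feats = list(u) + list(v)
    if append_prod_diff:
        feats += [a * b for a, b in zip(u, v)]       # element-wise product
        feats += [abs(a - b) for a, b in zip(u, v)]  # absolute difference
    return feats

u, v = [0.5, -1.0], [2.0, 0.25]
print(build_features(u, v, append_prod_diff=False))  # [0.5, -1.0, 2.0, 0.25]
print(build_features(u, v))
# [0.5, -1.0, 2.0, 0.25, 1.0, -0.25, 1.5, 1.25]
```

With the flag off, the classifier's input dimension is 2x the embedding size; with it on, 4x.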

Cheers, Oana

JasonChan7 commented 4 years ago

Okay, I have found your update to the dataset. I was confused because previously there was no train file with the _label suffix. There is no inconsistency now.

Thanks a lot! Zhen