FranxYao / dgm_latent_bow

Implementation of NeurIPS 19 paper: Paraphrase Generation with Latent Bag of Words
MIT License
123 stars 14 forks

wiki2bio/original_data/word_vocab.txt #1

Open yiyibooks opened 5 years ago

yiyibooks commented 5 years ago

Hi,

An error was thrown when I ran main.py: FileNotFoundError: [Errno 2] No such file or directory: '../wiki2bio/original_data/word_vocab.txt'

It seems that the file wiki2bio/original_data/word_vocab.txt is not included.

FranxYao commented 5 years ago

Oh, that is for testing the model on the WikiBio data-to-text generation task; it is not included in the paper. If you need this part, could you send me an email so I can send the files to you? yao.fu@columbia.edu

yiyibooks commented 5 years ago

Thank you very much!

In fact, I tried to Google wiki2bio later, and found this one https://github.com/tyliupku/wiki2bio. They should be the same dataset, right?

FranxYao commented 5 years ago

Yep, that's right. With the current code I think you can get it running. But in my tests the training was unstable and could collapse in the second epoch (the loss became NaN). I did not take a close look, but you may be able to fix this with a different initialization or a smaller learning rate.
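A minimal sketch (not code from this repository) of the usual mitigations when a loss turns NaN mid-training: lowering the learning rate and clipping the global gradient norm. The variable names (x, w, loss) and the hyperparameter values are placeholder assumptions, not the names or defaults used in dgm_latent_bow; it uses the TensorFlow 1.x API that the repo targets.

```python
import tensorflow as tf

# Placeholder model: a stand-in for the actual seq2seq loss in the repo.
x = tf.placeholder(tf.float32, [None, 10])
w = tf.get_variable("w", [10, 1])
loss = tf.reduce_mean(tf.square(tf.matmul(x, w)))

# Smaller learning rate than a typical default (e.g., 1e-3 -> 1e-4).
learning_rate = 1e-4
optimizer = tf.train.AdamOptimizer(learning_rate)

# Clip the global gradient norm to keep a single bad batch from
# blowing up the parameters and producing NaN losses afterwards.
params = tf.trainable_variables()
grads = tf.gradients(loss, params)
clipped_grads, _ = tf.clip_by_global_norm(grads, clip_norm=5.0)
train_op = optimizer.apply_gradients(zip(clipped_grads, params))
```

Swapping the weight initializer (e.g., a smaller-variance truncated normal) is the other knob mentioned above; it is applied where the model variables are created rather than in the training op.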
