DevSinghSachan / multilingual_nmt

Experiments on Multilingual NMT

Could you please share the pretrained models? #1

Closed hhxxttxsh closed 5 years ago

hhxxttxsh commented 6 years ago

I am interested in your experiments because you have trained the Transformer model on the TED dataset. Since I have very limited training resources, could you please share your pretrained English-to-German model with me?

Thanks!

DevSinghSachan commented 6 years ago

Hi, thanks for your interest in our work. While it may be difficult for me to provide the pre-trained model files due to the large number of experiments, you can train your own En->De model on the TED talks dataset using the following script: https://github.com/DevSinghSachan/multilingual_nmt/blob/master/tools/bpe_pipeline_en_de.sh

You can run this script from the main directory. Typically, a 6-layer Transformer model takes around 6-8 hours to converge on a modern Nvidia GeForce TitanX GPU. Let me know if you have further questions!
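For reference, a minimal sketch of how the script would be invoked, assuming it needs no extra arguments and that the TED talks data is available where the script expects it (check the script header for any paths it requires):

```bash
# Minimal sketch: run the En->De BPE training pipeline from the repository root.
# Assumes no required command-line arguments and that the TED corpus is in place.
cd multilingual_nmt
bash tools/bpe_pipeline_en_de.sh
```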

hhxxttxsh commented 6 years ago

Sorry, I guess I did not make it clear. I have no GPU at all. Any pre-trained model would be fine; it does not have to have the best performance.

