harvardnlp / seq2seq-attn

Sequence-to-sequence model with LSTM encoder/decoders and attention
http://nlp.seas.harvard.edu/code
MIT License
1.26k stars 278 forks

code for seq-level distillation #100

Open Oneplus opened 6 years ago

Oneplus commented 6 years ago

Hi Yoon,

As mentioned in the Sequence-Level Knowledge Distillation paper, the implementation of the distillation model is released in this repo, but I couldn't find the corresponding code (for either word-level or sequence-level distillation). Is it still under construction, or is there something I've missed? Please advise.

Regards,
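
For anyone arriving here from the paper: both objectives in Sequence-Level Knowledge Distillation (Kim & Rush, 2016) are straightforward to reimplement on top of any encoder-decoder that produces per-token logits. The sketch below is a minimal, illustrative PyTorch version, not this repo's Lua/Torch code; every function and argument name here is made up for the example.

```python
import torch.nn.functional as F

def word_level_kd_loss(student_logits, teacher_logits, nonpad_mask):
    """Word-level KD: at every target position, push the student's
    output distribution toward the teacher's soft distribution.

    student_logits, teacher_logits: (batch, tgt_len, vocab)
    nonpad_mask: (batch, tgt_len) float, 1.0 at real tokens, 0.0 at padding
    """
    log_q = F.log_softmax(student_logits, dim=-1)  # student log-probs
    p = F.softmax(teacher_logits, dim=-1)          # teacher probs
    # Cross-entropy against the teacher's soft targets, summed over the vocab
    ce = -(p * log_q).sum(dim=-1)
    return (ce * nonpad_mask).sum() / nonpad_mask.sum()

def seq_level_kd_nll(student_logits, teacher_beam_tokens, pad_idx):
    """Sequence-level KD: first decode the training set with the teacher's
    beam search, then train the student with ordinary NLL on those outputs.

    teacher_beam_tokens: (batch, tgt_len) token ids from the teacher's
    beam-search decode (pseudo-targets that replace the references).
    """
    return F.cross_entropy(
        student_logits.transpose(1, 2),  # (batch, vocab, tgt_len)
        teacher_beam_tokens,
        ignore_index=pad_idx,
    )
```

Note that sequence-level KD needs no special loss function: the work is in the data-preparation step (running the teacher's beam search over the training source sentences), after which training is plain maximum-likelihood on the teacher's outputs.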

LiangQiqi677 commented 3 years ago

I didn't find it either.