voidism / L2KD

Code for the EMNLP2020 long paper "Lifelong Language Knowledge Distillation" https://arxiv.org/abs/2010.02123

Incomplete code files? #1

Open vaibhavvarshney0 opened 3 years ago

vaibhavvarshney0 commented 3 years ago

It seems some code files are missing from this repo, like train.sh and test.sh. Or am I missing something?

voidism commented 3 years ago

Hi, sorry I overlooked this. I just uploaded the missing files in the latest commit 14a5df3.

vaibhavvarshney0 commented 3 years ago

Thanks. Also, for distil to conduct Word-KD, do we have to train the respective teacher model separately and provide it?

voidism commented 3 years ago

Yes, we need to train a single-task teacher model separately before we perform Word-KD.
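For anyone landing here later, here is a minimal sketch of what the Word-KD objective amounts to, not the repo's exact code: the separately trained single-task teacher is frozen and provides per-token soft targets, and the student minimizes the token-level KL divergence against them. The function name `word_kd_loss` and the `temperature` argument are illustrative, not identifiers from this repo.

```python
import torch
import torch.nn.functional as F

def word_kd_loss(student_logits, teacher_logits, temperature=1.0):
    """Word-level KD: per-token KL between student and frozen teacher.

    Both logits tensors have shape (batch, seq_len, vocab_size).
    """
    v = student_logits.size(-1)
    # Flatten to (num_tokens, vocab) so "batchmean" averages over tokens.
    s = F.log_softmax(student_logits.view(-1, v) / temperature, dim=-1)
    with torch.no_grad():  # the teacher is trained beforehand and kept frozen
        t = F.softmax(teacher_logits.view(-1, v) / temperature, dim=-1)
    # Standard temperature^2 scaling keeps gradient magnitudes comparable.
    return F.kl_div(s, t, reduction="batchmean") * temperature ** 2
```

In practice you would run the frozen teacher forward pass under `torch.no_grad()` to get `teacher_logits`, then combine this loss with the usual cross-entropy on the gold tokens.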