zhuchen03 / FreeLB

Adversarial Training for Natural Language Understanding

Could you add some comments in the code? #1

Open PengboLiu opened 4 years ago

PengboLiu commented 4 years ago

It's hard to understand the code, including the bash shell scripts.

zhuchen03 commented 4 years ago

Hi Pengbo,

Sorry for not writing enough comments. I just added comments to the hyperparameters used in fairseq-RoBERTa/launch/FreeLB/mnli-fp32-clip.sh and huggingface-transformers/launch/run_glue.sh, so that you can read the code starting from these scripts...

The fairseq code is more convoluted, but the Huggingface transformers version should be much easier to read. The algorithm is entirely contained in huggingface-transformers/examples/run_glue_freelb.py, plus some modifications for the dropout mask in the ALBERT model. fairseq also includes our implementations of FreeAT and YOPO, but it will take more time to read through.
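To make the algorithm easier to follow before diving into run_glue_freelb.py, here is a toy sketch of the FreeLB-style update loop on a logistic-regression stand-in for the model. This is not the repo's implementation: the function name `freelb_step`, the hyperparameter values, and the logistic model are all illustrative assumptions; the real code perturbs transformer word embeddings and uses Adam. The core idea is the same, though: take K gradient-ascent steps on an input perturbation delta (projected back into an eps-ball), and average the parameter gradients from all K steps into a single "free" parameter update.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def freelb_step(w, x, y, K=3, eps=0.1, alpha=0.03, lr=0.5, rng=None):
    """One FreeLB-style update on a toy logistic-regression model.

    x plays the role of the word embeddings that FreeLB perturbs;
    w is the model parameter vector. Parameter gradients from each of
    the K ascent steps on delta are averaged into one update, so the
    adversarial steps come (almost) for free.
    """
    rng = rng or np.random.default_rng(0)
    # Initialize the perturbation randomly inside the eps-ball.
    delta = rng.uniform(-eps, eps, size=x.shape)
    grad_w_acc = np.zeros_like(w)
    for _ in range(K):
        z = (x + delta) @ w
        p = sigmoid(z)
        dz = (p - y) / len(y)                       # dL/dz for BCE, y in {0,1}
        grad_w_acc += (x + delta).T @ dz / K        # accumulate param grads (1/K each)
        grad_delta = np.outer(dz, w)                # gradient w.r.t. the perturbation
        delta += alpha * grad_delta / (np.linalg.norm(grad_delta) + 1e-12)
        delta = np.clip(delta, -eps, eps)           # project back into the eps-ball
    return w - lr * grad_w_acc                      # one descent step on the average
```

A quick usage example on synthetic separable data (`w_true` and the data sizes are arbitrary):

```python
rng = np.random.default_rng(1)
x = rng.normal(size=(64, 4))
y = (x @ np.array([1.0, -2.0, 0.5, 1.5]) > 0).astype(float)
w = np.zeros(4)
for _ in range(50):
    w = freelb_step(w, x, y)
```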

I will add more comments to the code soon!