JulesBelveze / bert-squeeze

🛠️ Tools for Transformers compression using PyTorch Lightning ⚡
https://julesbelveze.github.io/bert-squeeze/

Use Callback for 2 stage training in DeeBert #34

Open JulesBelveze opened 1 year ago

JulesBelveze commented 1 year ago

DeeBert models need to be fine-tuned in a two-step fashion: first the final layer, then the ramps. The current implementation requires the user to run two separate trainings. However, this could be achieved in one shot using a pl.Callback, as is done for TheseusBert.
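
A minimal sketch of what such a callback might look like, assuming the LightningModule exposes the off-ramps under a `ramps` attribute and that the stage switch happens at a fixed epoch (both the attribute name and the `switch_epoch` parameter are hypothetical, not the repo's actual API):

```python
import pytorch_lightning as pl


class TwoStageDeeBertCallback(pl.Callback):
    """Handles DeeBert's two-stage fine-tuning within a single Trainer.fit() call.

    Stage 1 (epoch < switch_epoch): backbone and final classifier are trained.
    Stage 2 (epoch >= switch_epoch): everything is frozen except the ramps.
    """

    def __init__(self, switch_epoch: int = 3):
        self.switch_epoch = switch_epoch

    def on_train_epoch_start(self, trainer: pl.Trainer, pl_module: pl.LightningModule) -> None:
        if trainer.current_epoch < self.switch_epoch:
            return
        # Stage 2: freeze all parameters, then unfreeze only the ramp classifiers.
        for param in pl_module.parameters():
            param.requires_grad = False
        for param in pl_module.ramps.parameters():  # hypothetical attribute
            param.requires_grad = True
```

It would then be wired in the same way TheseusBert's callback is, e.g. `pl.Trainer(callbacks=[TwoStageDeeBertCallback(switch_epoch=3)], max_epochs=6)`, so the user gets both stages from a single fit call.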