Closed ShoubhikBanerjee closed 4 years ago
fine-tuning
I don't have the compute resources to pre-train BART (or most of the other big transformer models) from scratch, so most of my work is focused on transfer learning and fine-tuning pre-trained models on custom datasets.
Thanks for your reply.
Thanks for your awesome work! I just wanted to know whether your Colab notebook, 2020-05-23-text-generation-with-blurr.ipynb, does fine-tuning or pre-training.