Open yassmine-lam opened 3 years ago
Hi,
I read in some blogs that fine-tuning a BERT model works better than extracting features from BERT without fine-tuning and then training a neural network from scratch on those features. They justify this by the fact that fine-tuning a pre-trained model requires less labeled data than training a model from scratch. Has anyone tried to compare these two methods?
Thank you
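For concreteness, the two setups being compared differ mainly in whether the encoder's weights are updated during training. A minimal PyTorch sketch of that difference is below; it uses a tiny stand-in encoder in place of BERT (so it runs without downloading a checkpoint), and the batch and labels are dummy data:

```python
import torch
import torch.nn as nn

# Stand-in "pretrained encoder"; in practice this would be BERT
# (e.g. transformers' AutoModel). A tiny MLP keeps the sketch runnable.
encoder = nn.Sequential(nn.Linear(16, 8), nn.ReLU())
classifier = nn.Linear(8, 2)  # task-specific head

# Approach 1: feature extraction -- freeze the encoder, train only the head.
for p in encoder.parameters():
    p.requires_grad = False
feature_params = list(classifier.parameters())

# Approach 2: fine-tuning -- unfreeze and update encoder and head together.
for p in encoder.parameters():
    p.requires_grad = True
finetune_params = list(encoder.parameters()) + list(classifier.parameters())

# Either way, the forward pass and loss are the same; only the set of
# parameters handed to the optimizer changes.
x = torch.randn(4, 16)          # dummy batch of 4 input vectors
y = torch.tensor([0, 1, 0, 1])  # dummy labels
logits = classifier(encoder(x))
loss = nn.functional.cross_entropy(logits, y)
loss.backward()
```

The usual claim is that Approach 2 adapts the pre-trained representations to the task and therefore needs fewer labels, at the cost of more compute and a risk of overfitting on very small datasets.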