utterworks / fast-bert

Super easy library for BERT based NLP models
Apache License 2.0

added freeze_transformer_layers optional parameter #195

Closed · aaronbriel closed this 4 years ago

aaronbriel commented 4 years ago

Allows one to freeze all model parameters whose names start with the `data.model_type` value.
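
A minimal sketch of the idea, not the library's exact implementation: assuming a PyTorch model whose transformer parameters share the model-type prefix in their names (e.g. `bert.encoder.layer.0...`), freezing amounts to turning off gradients for those parameters.

```python
import torch

def freeze_transformer_layers(model: torch.nn.Module, model_type: str = "bert") -> None:
    """Disable gradient updates for every parameter whose name starts with model_type.

    Hypothetical helper illustrating the freeze_transformer_layers behaviour;
    only the classification head (parameters without the prefix) stays trainable.
    """
    for name, param in model.named_parameters():
        if name.startswith(model_type):
            param.requires_grad = False
```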

zabir-nabil commented 4 years ago

Awesome, so the slow training I was initially facing was due to training the whole model.

aaronbriel commented 4 years ago

> Awesome, so the slow training I was initially facing was due to training the whole model.

Possibly, although you should note that only training the classification layers may result in reduced accuracy. It's probably best to test both approaches and see if the trade-off between run-time and accuracy is worth it.
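
One quick way to see how much of the model is actually being trained after freezing, a sketch assuming a standard PyTorch model object, is to compare trainable and total parameter counts:

```python
def report_trainable(model) -> None:
    """Print how many parameters will receive gradient updates."""
    total = sum(p.numel() for p in model.parameters())
    trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
    print(f"trainable: {trainable:,} / total: {total:,} ({trainable / total:.1%})")
```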