Open ArjunParthasarathy opened 3 years ago
Description
@AlanAboudib this PR adds support for a BERT encoder and iterator, specifically for the masked language modeling (masked LM) use case.
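For context, the corruption scheme that masked LM training relies on can be sketched in plain Python. This is an illustrative sketch of BERT's standard 80/10/10 masking rule, not the code in this PR; `MASK_ID` and `VOCAB_SIZE` are assumptions matching `bert-base-uncased`, and `mask_tokens` is a hypothetical helper name.

```python
import random

MASK_ID = 103      # [MASK] token id in bert-base-uncased's vocab (assumption)
VOCAB_SIZE = 30522  # bert-base-uncased vocab size (assumption)

def mask_tokens(token_ids, mask_prob=0.15, seed=0):
    """Apply BERT-style masked-LM corruption.

    Roughly mask_prob of the positions are selected for prediction.
    Of the selected positions: 80% are replaced with [MASK], 10% with
    a random token, and 10% are left unchanged. Returns
    (corrupted_ids, labels), where labels is -100 at unselected
    positions so the loss ignores them.
    """
    rng = random.Random(seed)
    inputs = list(token_ids)
    labels = [-100] * len(token_ids)
    for i, tok in enumerate(token_ids):
        if rng.random() < mask_prob:
            labels[i] = tok           # predict the original token here
            roll = rng.random()
            if roll < 0.8:
                inputs[i] = MASK_ID   # 80%: replace with [MASK]
            elif roll < 0.9:
                inputs[i] = rng.randrange(VOCAB_SIZE)  # 10%: random token
            # remaining 10%: keep the original token unchanged
    return inputs, labels
```

In the HuggingFace Transformers library this logic is provided by `DataCollatorForLanguageModeling`, so an iterator typically only needs to yield batches of token ids.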
Affected Dependencies
The project now requires the HuggingFace Transformers library to be installed.
How has this been tested?
I used a structure very similar to the BPTT example notebook to verify that the encoder and iterator work when training a BERT model. The trained model achieved good test and validation scores on the WikiText-2 dataset.
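For language-model evaluation of this kind, the test score is typically reported as perplexity, i.e. the exponential of the mean per-token cross-entropy loss. A minimal sketch of that conversion (the `perplexity` helper is hypothetical, not part of this PR):

```python
import math

def perplexity(per_token_nll):
    """Perplexity from a list of per-token negative log-likelihoods
    (natural log), as produced by a cross-entropy loss over the
    masked positions."""
    return math.exp(sum(per_token_nll) / len(per_token_nll))
```

Lower perplexity corresponds to a lower average loss on held-out data, so validation and test perplexity are the numbers to compare across runs.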