Saber is a deep learning-based tool for information extraction in the biomedical domain. Pull requests are welcome! Note: this is a work in progress. Many things are broken, and the codebase is not stable.
The PyTorch-Transformers library recently added a new `AutoModel` API, which lets you instantiate any of the many available pre-trained transformers (BERT, GPT-2, RoBERTa, etc.).
We should switch from our BERT-specific code to `AutoModel`. Specifically, this will involve swapping all `BertModel`s, `BertConfig`s, and `BertTokenizer`s for `AutoModel`, `AutoConfig`, and `AutoTokenizer`. In the long run, this will let us seamlessly load pre-trained weights from any of the popular transformer language models.
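Roughly, the call sites would look something like the following. This is a minimal sketch, not Saber's actual code: it assumes the package imports as `transformers` (older `pytorch-transformers` releases use the `pytorch_transformers` import path), and the checkpoint name is just an example.

```python
from transformers import AutoConfig, AutoModel, AutoTokenizer

# Example checkpoint name: the same three calls work unchanged for
# "bert-base-cased", "roberta-base", etc., because the Auto* classes
# resolve the correct concrete class from the identifier.
pretrained_model = "bert-base-cased"

config = AutoConfig.from_pretrained(pretrained_model)
tokenizer = AutoTokenizer.from_pretrained(pretrained_model)
model = AutoModel.from_pretrained(pretrained_model, config=config)
```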
This should be addressed in two pull requests:
1. Simply use `AutoModel`, `AutoTokenizer`, and `AutoConfig` in place of the BERT-specific classes. This won't be enough to use all of the transformer weights, as our preprocessing steps are still BERT-specific, but it will let us use any of the models that have the same preprocessing steps as BERT (like RoBERTa).
2. Re-write our preprocessing steps to make them model-agnostic (a sketch of what this could look like follows this list).
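Very roughly, the second pull request could lean on the tokenizer's own attributes instead of hard-coded BERT tokens. This is a sketch only: the example sentence and checkpoint name are placeholders, and Saber's real preprocessing is more involved.

```python
from transformers import AutoTokenizer

# Placeholder input; in Saber this would come from the dataset loader.
text = "The MTOR gene regulates cell growth."

# The same code path works for "bert-base-cased", "roberta-base", etc.
tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")

# Instead of hard-coding "[CLS]" / "[SEP]", let the tokenizer insert whatever
# special tokens its model expects.
input_ids = tokenizer.encode(text, add_special_tokens=True)

# Model-specific values are exposed as attributes, so padding and masking
# logic doesn't need to know which model was loaded.
print(tokenizer.cls_token, tokenizer.sep_token, tokenizer.pad_token_id)
```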