cooelf / SemBERT

Semantics-aware BERT for Language Understanding (AAAI 2020)
https://arxiv.org/abs/1909.02209
MIT License

Use pre-trained SemBERT as a sentence encoder? #24

Closed (drussellmrichie closed this issue 2 years ago)

drussellmrichie commented 3 years ago

Hi, thanks for writing this very interesting paper and for providing this nice GitHub repo. :smile: I was wondering if it's possible -- and, more to the point, straightforward -- to use a SemBERT model that you have trained simply to get a sentence representation, which I can then feed into my own neural network for classification or regression? I have a classification (or maybe ordinal regression...) task for which SRL features might be useful over and above BERT alone.

cooelf commented 2 years ago

Yes, in our tasks, SemBERT is actually used as the sentence encoder for classification and regression.
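For anyone looking at this later, here is a minimal sketch of how one might reuse a pre-trained SemBERT-style encoder for a custom head, assuming the encoder returns token-level hidden states of shape (batch, seq_len, hidden). The class and argument names below (`SentenceEncoderHead`, `sembert_encoder`, `encoder_inputs`) are illustrative placeholders, not the actual classes or signatures in this repo; adapt them to whatever the repo's model actually returns.

```python
import torch
import torch.nn as nn

class SentenceEncoderHead(nn.Module):
    """Wrap a (pre-trained) SemBERT-style encoder and add a small task head.

    Assumes `sembert_encoder` maps its inputs (token ids, SRL tag ids, masks, ...)
    to hidden states of shape (batch, seq_len, hidden). Names are hypothetical.
    """

    def __init__(self, sembert_encoder: nn.Module, hidden_size: int,
                 num_labels: int, freeze_encoder: bool = True):
        super().__init__()
        self.encoder = sembert_encoder
        if freeze_encoder:
            # Keep the pre-trained SemBERT weights fixed and train only the head.
            for p in self.encoder.parameters():
                p.requires_grad = False
        # num_labels = 1 gives a regression head; >1 gives classification logits.
        self.head = nn.Linear(hidden_size, num_labels)

    def forward(self, *encoder_inputs):
        hidden = self.encoder(*encoder_inputs)   # (batch, seq_len, hidden)
        sentence_vec = hidden.mean(dim=1)        # simple mean pooling over tokens
        return self.head(sentence_vec)
```

Mean pooling is just one choice; taking the [CLS] position or whatever pooled output the repo's model already exposes should work equally well. Freezing the encoder gives fixed sentence embeddings; unfreeze it if you want to fine-tune SemBERT end to end on the downstream task.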