ajitrajasekharan / unsupervised_NER

Self-supervised NER prototype - updated version (69 entity types, 17 broad entity groups). Uses pretrained BERT models with no fine-tuning. State-of-the-art performance on 3 biomedical datasets.
MIT License

Using for other languages? #2

Open SuPanther opened 4 years ago

SuPanther commented 4 years ago

Can I apply this unsupervised NER method to other languages using multilingual BERT?

ajitrajasekharan commented 4 years ago

Yes, it should be possible. We just need a model pretrained on that language. Ideally it would be pretrained with a custom vocabulary on a corpus containing the entities of interest to us, but that is not required.

Ajit
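
To make the idea concrete, here is a minimal sketch of probing a multilingual BERT masked-LM head with no fine-tuning, which is the core signal this approach relies on. The HuggingFace transformers library and the bert-base-multilingual-cased checkpoint are my assumptions for illustration; they are not part of this repo's pipeline.

```python
# Minimal sketch (not this repo's code): probe a multilingual BERT
# masked-LM head, with no fine-tuning, for the terms it predicts at an
# entity position. The unsupervised-NER idea is that these predicted
# neighbor terms signal the entity's type.
from transformers import pipeline

# Assumption: the stock bert-base-multilingual-cased checkpoint; ideally
# you would swap in a model pretrained on your target language/corpus.
fill_mask = pipeline("fill-mask", model="bert-base-multilingual-cased")

# Mask the entity position in a target-language sentence (French here).
sentence = "[MASK] est la capitale de la France."

for pred in fill_mask(sentence, top_k=10):
    # Each prediction is a candidate filler plus its probability; fillers
    # like city names suggest the masked slot is a LOCATION-type entity.
    print(f"{pred['token_str']:>12}  {pred['score']:.3f}")
```

The repo's actual pipeline goes further, mapping the predicted terms to entity clusters to produce a label; this sketch stops at the raw prediction step.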
