jiesutd / NCRFpp

NCRF++, a Neural Sequence Labeling Toolkit. Easy to use for any sequence labeling task (e.g. NER, POS tagging, segmentation). It includes character LSTM/CNN, word LSTM/CNN, and softmax/CRF components.

about bert #144

Closed · ypc-stu closed 1 year ago

ypc-stu commented 5 years ago

Hello, I want to use pretrained vectors (trained with BERT) via the demo.train.config file, i.e. word_emb_dir=xxx(bert). What does NCRF++ need changed to support this? Or when do you plan to release BERT embedding support?
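
(For reference: word_emb_dir in NCRF++ points to a plain-text embedding file in word2vec/GloVe style, one line per word consisting of the token followed by its vector values, so a raw BERT checkpoint cannot be dropped in directly. The relevant config lines look roughly like the excerpt below; the path follows the repo's sample data and the values are illustrative, not prescribed.)

```
### excerpt from a demo.train.config-style file (illustrative values)
word_emb_dir=sample_data/sample.word.emb
word_emb_dim=50
norm_word_emb=False
```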

jiesutd commented 5 years ago

Good question. I planned to incorporate various pretrained embeddings half a year ago but couldn't find enough time to implement it. I am not sure when I can release it, but I will definitely work on this.

CHENPoHeng commented 4 years ago

Hi, I'm also thinking of using BERT embeddings with NCRF++. Since I saw someone already requested this, I'm just checking in to ask whether it has been implemented.

jiesutd commented 4 years ago

@CHENPoHeng Hi, we have implemented an initial version that integrates different BERT models (via Hugging Face) in the dev version, but we haven't evaluated it yet. We plan to run a detailed evaluation and then merge it into the master branch.
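
(The dev-branch code isn't shown in this thread, but the usual shape of such a Hugging Face integration for a word-level tagger looks like the sketch below: run the sentence through BERT, then pool subword vectors back to one vector per input word. The model name and mean-pooling choice are assumptions, not the actual NCRF++ implementation.)

```python
# Minimal sketch, NOT the NCRF++ dev-branch code: contextual per-word features
# from a Hugging Face BERT model, with subword-to-word alignment.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")  # assumed model name
model = AutoModel.from_pretrained("bert-base-cased")
model.eval()

words = ["John", "lives", "in", "New", "York"]
enc = tokenizer(words, is_split_into_words=True, return_tensors="pt")
with torch.no_grad():
    hidden = model(**enc).last_hidden_state.squeeze(0)  # (n_subwords, 768)

# Map each subword position back to its source word, then average per word.
word_vecs = [[] for _ in words]
for idx, wid in enumerate(enc.word_ids(batch_index=0)):
    if wid is not None:  # skip [CLS]/[SEP] special tokens
        word_vecs[wid].append(hidden[idx])
features = torch.stack([torch.stack(v).mean(dim=0) for v in word_vecs])
print(features.shape)  # torch.Size([5, 768]): one contextual vector per word
```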

myeghaneh commented 3 years ago

Hi everyone, is it possible to use contextual word embeddings (BERT, ELMo, ...)? I have worked on sequence tagging using NCRF++ with the default word embeddings, and I am wondering how to use BERT there. Many thanks.
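
(One workaround that keeps the default word_emb_dir pipeline is to distill static per-word vectors from BERT into the GloVe-style text format NCRF++ reads. A minimal sketch under assumed names is below; note that this discards context, since each vocabulary word ends up with a single fixed vector.)

```python
# Hypothetical workaround: distill static per-word vectors from BERT and write
# them in the GloVe-style text format that NCRF++'s word_emb_dir expects.
# Model name, vocabulary, and output path are placeholders.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
model = AutoModel.from_pretrained("bert-base-cased")
model.eval()

def word_vector(word):
    """Average BERT's subword vectors to get one static vector per word."""
    inputs = tokenizer(word, return_tensors="pt", add_special_tokens=False)
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state  # (1, n_subwords, 768)
    return hidden.mean(dim=1).squeeze(0)

vocab = ["Obama", "visited", "Paris"]  # replace with your training vocabulary
with open("bert.word.emb", "w", encoding="utf-8") as f:
    for word in vocab:
        vec = word_vector(word).tolist()
        f.write(word + " " + " ".join(f"{v:.6f}" for v in vec) + "\n")
```

Then point the config at the new file, e.g. word_emb_dir=bert.word.emb with word_emb_dim=768.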

jiesutd commented 1 year ago

Hello everyone, I know it is late, but we have updated NCRF++ to YATO (https://github.com/jiesutd/YATO), which can fully utilize large pretrained language models.