Good question. I planned to incorporate those various pretrained embeddings half a year ago but couldn't find enough time to implement it. I am not sure when I can release it, but I will definitely work on this.
Hi, I'm also thinking of using BERT embeddings in NCRF++. Since I saw someone has already requested this, I'm just checking in to see whether it has been implemented?
@CHENPoHeng Hi, we have implemented an initial version that integrates different BERT models (using Hugging Face) in the dev branch, but we haven't evaluated it yet. We plan to run a detailed evaluation and then merge it into the master branch.
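For anyone who wants to experiment before that lands in master, here is a rough sketch of the general idea of extracting contextual token embeddings with the Hugging Face transformers library. This is not the actual NCRF++ dev-branch code; the model name, example sentence, and mean-pooling choice are all just placeholders:

```python
import torch
from transformers import AutoModel, AutoTokenizer

# Placeholder model; any BERT-style checkpoint works the same way.
tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
model = AutoModel.from_pretrained("bert-base-cased")
model.eval()

sentence = ["EU", "rejects", "German", "call"]  # one pre-tokenized sentence

# Tokenize with word alignment info so subword vectors can be pooled per word.
encoding = tokenizer(sentence, is_split_into_words=True, return_tensors="pt")
with torch.no_grad():
    hidden = model(**encoding).last_hidden_state.squeeze(0)  # (num_subwords, 768)

# Mean-pool subword vectors back to one contextual vector per original word.
word_ids = encoding.word_ids()
word_vectors = [
    hidden[[j for j, w in enumerate(word_ids) if w == i]].mean(dim=0)
    for i in range(len(sentence))
]
print(len(word_vectors), word_vectors[0].shape)  # 4 vectors, each 768-dim
```

Those per-word vectors could then be fed into a tagger in place of (or concatenated with) static word embeddings, which is the general shape of any BERT integration.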
Hi everyone, is it possible to use contextual word embeddings (BERT, ELMo, ...)? I have worked on sequence tagging using NCRF++ with the default word embeddings, and I am wondering how to use BERT there. Many thanks.
Hello everyone, I know it is quite late, but we have updated NCRF++ to YATO (https://github.com/jiesutd/YATO), which can fully utilize large pretrained language models.
Hello, I want to use a pretrained vector file (trained with BERT) through the demo.train.config file, i.e. word_emb_dir=xxx (BERT). Where does NCRF++ need to be changed? Or when will you release BERT embedding support?
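One possible workaround that avoids changing NCRF++ at all: word_emb_dir points to a GloVe/word2vec-style text embedding file (one token per line followed by its vector, as in the repository's sample embeddings), so you could dump BERT's static input (wordpiece) embeddings into that format. Below is a minimal sketch, assuming the Hugging Face transformers library; the model name and output path are just example choices. Note the caveats: this discards BERT's contextualization, and wordpiece tokens such as ##ing will not match whole corpus words.

```python
from transformers import AutoModel, AutoTokenizer

# Placeholder checkpoint; any BERT-style model works the same way.
tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
model = AutoModel.from_pretrained("bert-base-cased")

# BERT's static wordpiece embedding matrix: (vocab_size, hidden_size).
matrix = model.get_input_embeddings().weight.detach()

# Write one "token v1 v2 ... vN" line per wordpiece, GloVe-style.
with open("bert_static.emb", "w", encoding="utf-8") as out:
    for token, idx in tokenizer.get_vocab().items():
        vec = " ".join(f"{x:.6f}" for x in matrix[idx].tolist())
        out.write(f"{token} {vec}\n")
```

Then, in the config, something like word_emb_dir=bert_static.emb together with word_emb_dim=768 (the hidden size of the chosen checkpoint) should let NCRF++ load the vectors as ordinary pretrained word embeddings.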