wavewangyue / ner

Named Entity Recognition: Practice and Exploration (命名实体识别实践与探索)
706 stars 128 forks

Did you use pretrained character embeddings in the BiLSTM+CRF model? #3

Closed — one-tree closed this issue 3 years ago

wavewangyue commented 3 years ago

I tried it, but pretrained character embeddings brought no improvement. This may be because, in Chinese, the character vocabulary is much smaller than the word vocabulary. Pretrained word embeddings, on the other hand, are very useful, as shown in BiLSTM+CRF+WLF.
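For context, loading pretrained embeddings into such a model usually means building an embedding matrix where rows for known characters (or words) come from the pretrained vectors and the rest are randomly initialized. The sketch below is a minimal, hypothetical illustration of that step — the function name, the toy vocabulary, and the use of NumPy are assumptions for illustration, not code from this repository:

```python
import numpy as np

def build_embedding_matrix(vocab, pretrained, dim, seed=0):
    """Build a (vocab_size, dim) matrix for an embedding layer.

    Rows for tokens found in `pretrained` are copied from the
    pretrained vectors; all other rows (e.g. OOV tokens, padding)
    get a small uniform random initialization.
    """
    rng = np.random.default_rng(seed)
    matrix = rng.uniform(-0.25, 0.25, size=(len(vocab), dim))
    for token, idx in vocab.items():
        if token in pretrained:
            matrix[idx] = pretrained[token]
    return matrix

# Hypothetical tiny example: two characters with pretrained vectors,
# one padding token that falls back to random initialization.
vocab = {"<pad>": 0, "中": 1, "国": 2}
pretrained = {"中": np.ones(4), "国": np.zeros(4)}
emb = build_embedding_matrix(vocab, pretrained, dim=4)
```

The resulting matrix would then be used to initialize the model's embedding layer (optionally kept trainable so the vectors can be fine-tuned during training).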