I tried it, but pretraining the character embeddings gave no improvement. This may be because the character vocabulary in Chinese is much smaller than the word vocabulary. Pretrained word embeddings, however, are very useful, as shown by the BiLSTM+CRF+WLF results.
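A minimal sketch (not the repository's actual code) of what this setup could look like: character embeddings trained from scratch, with pretrained word embeddings concatenated as word-level features (WLF) before the BiLSTM. The vocabulary sizes, dimensions, and the `pretrained_word_vectors` tensor are illustrative assumptions.

```python
import torch
import torch.nn as nn

char_vocab_size, char_dim = 5000, 100     # Chinese char vocab is relatively small
word_vocab_size, word_dim = 300000, 300   # word vocab is much larger

# Hypothetical pretrained matrix, e.g. loaded from word2vec-style vectors.
pretrained_word_vectors = torch.randn(word_vocab_size, word_dim)

# Character embeddings: trained from scratch (pretraining gave no gain here).
char_emb = nn.Embedding(char_vocab_size, char_dim)

# Word embeddings: initialized from the pretrained vectors (the helpful part).
word_emb = nn.Embedding.from_pretrained(pretrained_word_vectors, freeze=False)

def featurize(char_ids, word_ids):
    """Concatenate char embeddings with word-level features for the BiLSTM."""
    return torch.cat([char_emb(char_ids), word_emb(word_ids)], dim=-1)

# Example: batch of 2 sentences, 7 positions each, where word_ids map each
# character position to the word containing it (one common WLF scheme).
chars = torch.randint(0, char_vocab_size, (2, 7))
words = torch.randint(0, word_vocab_size, (2, 7))
print(featurize(chars, words).shape)  # torch.Size([2, 7, 400])
```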