guillaumegenthial / tf_ner

Simple and Efficient Tensorflow implementations of NER models with tf.estimator and tf.data
Apache License 2.0

Word Embedding #22

Open aggounix opened 6 years ago

aggounix commented 6 years ago

Is it possible to use a word embedding other than GloVe (for other languages)?

QianhuiWu commented 5 years ago

Is it possible to use a word embedding other than GloVe (for other languages)?

I think that's okay. You just need to change the path of the embedding file in build_glove.py.
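For reference, the conversion step in build_glove.py is essentially a text-file parse into a NumPy matrix indexed by the vocabulary, so swapping embeddings mostly comes down to changing one path. Here is a minimal sketch in that spirit, not the repo's exact code; the file names (vocab.words.txt, my_embeddings.vec, embeddings.npz) and the 300-dimension assumption are placeholders you would adapt:

```python
"""Sketch of a build_glove.py-style conversion for an arbitrary embedding file.

Assumptions: the vocab file has one word per line (as produced by build_vocab.py)
and the embedding file is plain text with "word v1 v2 ... vd" per line, which
covers GloVe dumps and fastText .vec files.
"""
from pathlib import Path

import numpy as np

VOCAB_PATH = Path("vocab.words.txt")         # assumed output of build_vocab.py
EMBEDDINGS_PATH = Path("my_embeddings.vec")  # any word-per-line text embedding
OUTPUT_PATH = Path("embeddings.npz")         # file the model will load
DIM = 300                                    # must match the embedding file

if __name__ == "__main__":
    # Map each vocabulary word to a row index.
    with VOCAB_PATH.open(encoding="utf-8") as f:
        word_to_idx = {line.strip(): idx for idx, line in enumerate(f)}
    size_vocab = len(word_to_idx)

    # Zero-initialise; words missing from the embedding file keep the zero vector.
    embeddings = np.zeros((size_vocab, DIM), dtype=np.float32)

    found = 0
    with EMBEDDINGS_PATH.open(encoding="utf-8") as f:
        for line in f:
            parts = line.rstrip().split(" ")
            if len(parts) != DIM + 1:
                continue  # skip header or malformed lines (fastText .vec has a header line)
            word, vector = parts[0], parts[1:]
            if word in word_to_idx:
                embeddings[word_to_idx[word]] = np.asarray(vector, dtype=np.float32)
                found += 1
    print(f"Found embeddings for {found} / {size_vocab} vocabulary words")

    np.savez_compressed(OUTPUT_PATH, embeddings=embeddings)
```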

mraduldubey commented 5 years ago

Yes, that would be fine @aggounix. I have used fastText instead of GloVe, and on top of that I have used quantized embeddings to reduce the memory requirement.
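On the memory point: one simple way to "quantize" a pre-trained embedding matrix is to store it in a lower-precision format and cast back to float32 at load time. A minimal sketch of that idea, independent of this repo's code (the float16 choice and file names are assumptions, not what @mraduldubey necessarily did):

```python
import numpy as np

# Load the float32 matrix produced by a build script such as the one above (name assumed).
embeddings = np.load("embeddings.npz")["embeddings"]

# Half precision halves the on-disk and in-memory footprint; for 300-d word
# vectors the accuracy loss is usually negligible for downstream NER.
np.savez_compressed("embeddings_fp16.npz", embeddings=embeddings.astype(np.float16))

# At model-build time, cast back to float32 before creating the lookup variable,
# since most TensorFlow ops expect float32 inputs.
restored = np.load("embeddings_fp16.npz")["embeddings"].astype(np.float32)
```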

ghost commented 5 years ago

Has anyone tried the feature extractor from the BERT repo? I was thinking about taking the last two layers of BERT, concatenating them, and using the resulting 1536-dimensional vector instead of the GloVe ones.
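For anyone exploring that route, here is a rough sketch of extracting and concatenating the last two hidden layers. It uses the Hugging Face transformers library as a stand-in for the original BERT repo's extract_features.py; the checkpoint name, layer choice, and lack of pooling are assumptions. For a bert-base model the result is 2 × 768 = 1536 dimensions per token:

```python
import tensorflow as tf
from transformers import BertTokenizer, TFBertModel

# bert-base-cased is an assumption; any BERT checkpoint with hidden size 768 works.
tokenizer = BertTokenizer.from_pretrained("bert-base-cased")
model = TFBertModel.from_pretrained("bert-base-cased")

sentence = "John lives in New York"
inputs = tokenizer(sentence, return_tensors="tf")

# output_hidden_states=True returns the embedding layer plus every encoder layer.
outputs = model(inputs, output_hidden_states=True)
hidden_states = outputs.hidden_states  # tuple of tensors, each (batch, seq_len, 768)

# Concatenate the last two layers along the feature axis -> (batch, seq_len, 1536).
last_two = tf.concat([hidden_states[-1], hidden_states[-2]], axis=-1)
print(last_two.shape)
```

One caveat: BERT operates on WordPiece subtokens, so these vectors have to be aligned back to whole words (e.g. by taking the first subtoken per word) before they can replace the word-level GloVe lookups this repo's models expect.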