dmlc / gluon-nlp

NLP made easy
https://nlp.gluon.ai/
Apache License 2.0

Want to add bi-lstm CRF model #100

Open fierceX opened 6 years ago

fierceX commented 6 years ago

I've recently been doing NLP work and have been following MXNet. I'd like to see a BiLSTM-CRF model added; TensorFlow already has a CRF module, and I hope MXNet and Gluon get one soon. Thank you.

szha commented 6 years ago

Indeed, CRF is useful in many decoding tasks.

Contributions are welcome!

zhiheng-huang commented 6 years ago

+1. LSTM/BLSTM + CRF/segmental CRF would be very useful.

lixin4ever commented 6 years ago

@szha I will handle this issue.

szha commented 6 years ago

Great, thanks @lixin4ever. Feel free to connect with @sxjscience and he can help out if you have some questions :)

szha commented 5 years ago

https://github.com/kenjewu/CRF provides a good starting point. I'm currently looking into hybridizing the CRF model.
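
For reference, a rough, untested sketch of the forward-algorithm (log-partition) computation such a layer needs, written as a plain `gluon.Block` with `nd` ops (hybridization would additionally require replacing the Python loop and shape accesses with symbol-friendly ops). Names like `CRF`, `num_tags`, and `emissions` are placeholders, not the API of the linked repo:

```python
import mxnet as mx
from mxnet import nd, gluon

class CRF(gluon.Block):
    """Sketch of a linear-chain CRF scoring layer (placeholder names)."""
    def __init__(self, num_tags, **kwargs):
        super(CRF, self).__init__(**kwargs)
        self.num_tags = num_tags
        with self.name_scope():
            # transitions[i, j] = score of moving from tag i to tag j
            self.transitions = self.params.get(
                'transitions', shape=(num_tags, num_tags),
                init=mx.init.Uniform(0.1))

    @staticmethod
    def _log_sum_exp(scores, axis):
        # numerically stable log-sum-exp along `axis`
        max_score = nd.max(scores, axis=axis, keepdims=True)
        return nd.max(scores, axis=axis) + nd.log(
            nd.sum(nd.exp(nd.broadcast_sub(scores, max_score)), axis=axis))

    def forward(self, emissions):
        # emissions: (seq_len, batch, num_tags) unary scores from the encoder
        # returns the log partition function for each sequence in the batch
        trans = self.transitions.data()
        alphas = emissions[0]                              # (batch, num_tags)
        for t in range(1, emissions.shape[0]):
            # (batch, prev_tag, 1) + (1, prev_tag, cur_tag) + (batch, 1, cur_tag)
            scores = nd.broadcast_add(
                nd.broadcast_add(alphas.expand_dims(axis=2),
                                 trans.expand_dims(axis=0)),
                emissions[t].expand_dims(axis=1))
            alphas = self._log_sum_exp(scores, axis=1)     # sum over prev_tag
        return self._log_sum_exp(alphas, axis=1)           # (batch,)
```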

lixin4ever commented 5 years ago

Sorry for the delay. I can continue working on this after I finish my current work (maybe at the end of December), if it is still unresolved by then.

hazelnutsgz commented 5 years ago

fastNLP (https://github.com/fastnlp/fastNLP/), an NLP project based on PyTorch, may be of great help. FYI

CRF

https://github.com/fastnlp/fastNLP/blob/master/fastNLP/modules/decoder/CRF.py
https://fastnlp.readthedocs.io/en/latest/fastNLP.modules.decoder.html?highlight=CRF

BiLSTM + CRF

https://github.com/fastnlp/fastNLP/blob/master/fastNLP/models/sequence_modeling.py
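
As a rough illustration of the model structure those links implement, here is a hypothetical (untested) Gluon wiring of a BiLSTM encoder producing emission scores for a CRF layer like the one sketched above. All names here are made up for illustration, and the full negative log-likelihood would also require scoring the gold tag path, which is omitted:

```python
from mxnet import gluon
from mxnet.gluon import nn, rnn

class BiLSTMCRFTagger(gluon.Block):
    """Hypothetical BiLSTM encoder feeding emission scores into a CRF layer."""
    def __init__(self, vocab_size, embed_size, hidden_size, num_tags, **kwargs):
        super(BiLSTMCRFTagger, self).__init__(**kwargs)
        with self.name_scope():
            self.embedding = nn.Embedding(vocab_size, embed_size)
            self.encoder = rnn.LSTM(hidden_size, bidirectional=True)  # layout 'TNC'
            self.hidden2tag = nn.Dense(num_tags, flatten=False)
            self.crf = CRF(num_tags)  # the CRF sketch from the earlier comment

    def forward(self, inputs):
        # inputs: (seq_len, batch) token indices
        embedded = self.embedding(inputs)      # (seq_len, batch, embed_size)
        encoded = self.encoder(embedded)       # (seq_len, batch, 2 * hidden_size)
        emissions = self.hidden2tag(encoded)   # (seq_len, batch, num_tags)
        return emissions, self.crf(emissions)  # unary scores and log partition
```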

bratao commented 5 years ago

@szha @lixin4ever Any update on CRF support? It is a must-have for many NLP applications.

eric-haibin-lin commented 5 years ago

@kenjewu has a PR on this: https://github.com/dmlc/gluon-nlp/pull/466 but it's not merged yet. @kenjewu would you have some bandwidth to address the remaining review comments?

eric-haibin-lin commented 5 years ago

@bratao you might also be interested in https://github.com/dmlc/gluon-nlp/pull/612

vanewu commented 5 years ago

Ok, I will continue to track the PR.