Traceback (most recent call last):
  File "main.py", line 71, in <module>
    train(args=args)
  File "/home/sncdbs/net_disk/zn/bert_lstm_crf/python/bert-ner/bert_lstm_ner.py", line 650, in train
    tf.e…
@pjox and I are working on a model trained with RoBERTa and its BPE tokenizer, in particular [zeldarose](https://github.com/LoicGrobol/zeldarose), which uses slightly different special tokens.
…
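To make the special-token difference concrete, here is a minimal sketch, assuming the Hugging Face `transformers` tokenizers (zeldarose's actual token set may differ from these defaults):

```python
# Compare BERT's WordPiece special tokens with RoBERTa's BPE special tokens.
from transformers import AutoTokenizer

bert_tok = AutoTokenizer.from_pretrained("bert-base-cased")
roberta_tok = AutoTokenizer.from_pretrained("roberta-base")

# BERT wraps inputs in [CLS] ... [SEP]; RoBERTa uses <s> ... </s> instead.
print(bert_tok.cls_token, bert_tok.sep_token)        # [CLS] [SEP]
print(roberta_tok.cls_token, roberta_tok.sep_token)  # <s> </s>

# The encoded ids therefore differ even for the same sentence, so any code
# that hard-codes BERT's special tokens will mis-handle RoBERTa inputs.
print(bert_tok("Hello world")["input_ids"])
print(roberta_tok("Hello world")["input_ids"])
```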
Since gluon-nlp already has very good tools for BERT, and basic data processing for named entity recognition is ready from https://github.com/dmlc/gluon-nlp/pull/466, I wanted to bu…
Although there is an example of transformers in another repo (for sentiment analysis) that is easy to adapt to other cases, I think sequence tagging is a bit more challenging due to the fact that Ber…
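The usual difficulty is that BERT-style tokenizers split words into subword pieces, so word-level NER labels no longer line up one-to-one with model inputs. A minimal sketch of the common workaround (all names here are illustrative; it assumes a tokenizer with a `tokenize` method):

```python
# Keep the NER label on the first subword piece of each word and mask the
# continuation pieces so the loss function ignores them.
def align_labels(words, labels, tokenizer, ignore_index=-100):
    token_labels = []
    for word, label in zip(words, labels):
        pieces = tokenizer.tokenize(word)
        # First piece carries the label; continuation pieces are masked out.
        token_labels.extend([label] + [ignore_index] * (len(pieces) - 1))
    return token_labels
```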
### Describe the bug
I keep running into CUDA OOM problems when training NER models using Transformer embeddings and a biLSTM-CRF. The trainer can't even get through one epoch. It's weird, since I used …
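One mitigation worth trying, independent of whichever trainer is in use here: a smaller per-step batch with gradient accumulation, which trades compute for memory while keeping the effective batch size. A minimal PyTorch sketch with toy stand-ins for the real model and data:

```python
import torch
from torch import nn

# Toy model and data standing in for the real embeddings + biLSTM-CRF;
# the point is the accumulation loop, not the architecture.
model = nn.Linear(768, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
criterion = nn.CrossEntropyLoss()
loader = [(torch.randn(2, 768), torch.randint(0, 2, (2,))) for _ in range(8)]

accumulation_steps = 4  # effective batch size = 2 * 4 = 8

optimizer.zero_grad()
for step, (x, y) in enumerate(loader):
    # Scale the loss so accumulated gradients average rather than sum.
    loss = criterion(model(x), y) / accumulation_steps
    loss.backward()
    if (step + 1) % accumulation_steps == 0:
        optimizer.step()
        optimizer.zero_grad()
```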
Hi Patrice,
This is what I raised on Mattermost, but I thought it would be good to have an issue to keep the information together.
Since GROBID 0.5.4 it should use all of the available threads, and we ha…
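One way to check whether the service actually uses all threads is to fire concurrent requests at it. A rough sketch, assuming the default GROBID REST URL and the `processFulltextDocument` endpoint (adjust both to your deployment):

```python
import glob
from concurrent.futures import ThreadPoolExecutor
import requests

# Assumed default service URL; change host/port for your setup.
GROBID_URL = "http://localhost:8070/api/processFulltextDocument"

def process(path):
    # Each PDF is sent as a multipart upload in the "input" field.
    with open(path, "rb") as f:
        r = requests.post(GROBID_URL, files={"input": f}, timeout=120)
    return path, r.status_code

# Eight workers should keep several GROBID threads busy at once.
with ThreadPoolExecutor(max_workers=8) as pool:
    for path, status in pool.map(process, glob.glob("pdfs/*.pdf")):
        print(status, path)
```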
Hi,
Thank you for sharing.
I'm interested in whether you tried to use BERT to improve the performance of JMEE.
I tried to reproduce JMEE, but I can't achieve the results reported in the paper.
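For reference, the first thing people usually try is feeding frozen BERT contextual vectors into the existing encoder in place of static embeddings. A minimal sketch, assuming the Hugging Face `transformers` library (JMEE itself defines none of this):

```python
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
bert = AutoModel.from_pretrained("bert-base-uncased").eval()

sentence = "The bank approved the loan yesterday."
inputs = tokenizer(sentence, return_tensors="pt")
with torch.no_grad():
    # Contextual vectors, one per subword piece: shape (1, seq_len, 768).
    hidden = bert(**inputs).last_hidden_state

# These vectors would replace the static embedding lookup feeding
# JMEE's downstream encoder.
print(hidden.shape)
```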