-
When trying to run train.py on a custom dataset, I got a TypeError. The custom dataset was prepared as mentioned in the description. I am using Windows, and my Python version is 3.9. …
-
Dear Admin,
I ran into a problem when testing with a single sequence after the training process:
Traceback (most recent call last):
File "main.py", line 465, in
decode_results, pred_scores = load_mode…
-
Hello,
The BERT-BiLSTM-CRF model ran for about 28 epochs (including several patience epochs) and stopped at a best F1 of 0.760492, while your reported results show a score of more than 90.
`INFO:__m…
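(For comparison: scores in this range are usually entity-level F1, and a common source of such gaps is token-level versus entity-level scoring. Below is a minimal sketch of entity-level evaluation with seqeval, assuming BIO-tagged sequences; it is not this repository's own evaluation code, and the tag sequences are made up.)
```
# Minimal sketch: entity-level F1 with seqeval (assumed evaluation style,
# not necessarily the one used in this repository).
from seqeval.metrics import classification_report, f1_score

# Hypothetical gold and predicted BIO tag sequences for two sentences.
y_true = [["B-PER", "I-PER", "O", "B-LOC"], ["O", "B-ORG", "I-ORG", "O"]]
y_pred = [["B-PER", "I-PER", "O", "B-LOC"], ["O", "B-ORG", "O", "O"]]

print(f1_score(y_true, y_pred))           # entity-level micro F1
print(classification_report(y_true, y_pred))
```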
-
Hello Zhong,
First, I appreciate the contribution.
I have previously used RNNSharp successfully for an NER task, and I am curious to try this new version as well; however, when I try to run it, it generates huge …
-
The trained model is a recognition model: CRNN (mobilenetv3_small + BiLSTM + CTC).
Paddle-to-ONNX conversion command:
`paddle2onnx --model_dir inference/rec_crnn/ --model_filename inference.pdmodel --params_filename inference.pdiparams --save_file on…
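(One quick way to sanity-check a paddle2onnx export is to load it with onnxruntime and run a dummy image through it. The sketch below assumes an output file named rec_crnn.onnx and a 1x3x32x320 NCHW input; neither value is taken from this report.)
```
# Minimal sketch: sanity-check a paddle2onnx export with onnxruntime.
# The file name "rec_crnn.onnx" and the 3x32x320 input shape are assumptions.
import numpy as np
import onnxruntime as ort

sess = ort.InferenceSession("rec_crnn.onnx", providers=["CPUExecutionProvider"])

# Print the graph's declared inputs/outputs to confirm names and shapes.
for inp in sess.get_inputs():
    print("input:", inp.name, inp.shape)
for out in sess.get_outputs():
    print("output:", out.name, out.shape)

# Run one dummy image through the model (batch of 1, NCHW layout assumed).
dummy = np.random.rand(1, 3, 32, 320).astype(np.float32)
result = sess.run(None, {sess.get_inputs()[0].name: dummy})
print("logits shape:", result[0].shape)
```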
-
I ran my code with keras_contrib without any problem, but now I want to retrain it with TensorFlow 2 and use this code for the CRF.
I get this error:
`ValueError: The last dimension of the input…
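(For context, one common TF2 substitute for keras_contrib's CRF is tensorflow_addons. The sketch below shows a BiLSTM emitting per-token scores whose last dimension equals the number of tags, which is the dimension that ValueError is complaining about. All sizes are placeholders, not values from the issue.)
```
# Minimal sketch of a BiLSTM-CRF in TF2 with tensorflow_addons, as one
# possible replacement for keras_contrib's CRF layer. All sizes are made up.
import tensorflow as tf
import tensorflow_addons as tfa

num_tags, vocab_size, embed_dim, hidden = 9, 5000, 100, 128

# Emission scores: the last dimension must equal the number of tags.
inputs = tf.keras.Input(shape=(None,), dtype=tf.int32)
x = tf.keras.layers.Embedding(vocab_size, embed_dim)(inputs)
x = tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(hidden, return_sequences=True))(x)
emissions = tf.keras.layers.Dense(num_tags)(x)
model = tf.keras.Model(inputs, emissions)

# CRF transition matrix and loss live outside the Keras model in this sketch.
transitions = tf.Variable(tf.random.uniform([num_tags, num_tags]))

def crf_loss(tags, scores, lengths):
    log_likelihood, _ = tfa.text.crf_log_likelihood(scores, tags, lengths, transitions)
    return -tf.reduce_mean(log_likelihood)

# Dummy batch: 2 sentences padded to length 7.
tokens = tf.random.uniform([2, 7], maxval=vocab_size, dtype=tf.int32)
tags = tf.random.uniform([2, 7], maxval=num_tags, dtype=tf.int32)
lengths = tf.constant([7, 5])

scores = model(tokens)
print("loss:", float(crf_loss(tags, scores, lengths)))
decoded, _ = tfa.text.crf_decode(scores, transitions, lengths)
print("decoded tags shape:", decoded.shape)
```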
-
Hey @amaiya, hoping you can help me refactor my NER app so it works with ktrain v0.26.x.
Background: in February 2021, I had a Streamlit app that was using a model.h5, model.json, and preproc.sav …
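(For reference, in ktrain v0.26.x a saved predictor is normally reloaded with ktrain.load_predictor. The sketch below uses a placeholder directory name; whether the older model.h5 / model.json / preproc.sav files can be loaded this way, or need to be re-saved first, is exactly the open question.)
```
# Minimal sketch: reloading a saved ktrain NER predictor in v0.26.x.
# "ner_predictor_dir" is a placeholder path, not one from this issue.
import ktrain

predictor = ktrain.load_predictor("ner_predictor_dir")

text = "Barack Obama visited Paris in 2015."
# For sequence taggers, predict() returns (token, tag) pairs.
for token, tag in predictor.predict(text):
    print(f"{token}\t{tag}")
```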
-
Hello,
With the given parameters, I applied a CharCNN and concatenated it with the BERT embedding alongside POS embeddings. However, the CharCNN concatenation gives a dimension error.
```
File "ntagger/model.py", l…
```
-
Dear colleagues,
I'm impressed by the results described in your article.
I would really like to use your pipeline on our data from Arabidopsis thaliana DNA ONT sequencing.
I’m tryin…
-
I would like to implement a custom network with a BiLSTM + attention mechanism for text classification, but the official examples only include the video LSTM + attention model (PaddleHub-release-v1.8\PaddleHub-release-v1.8\hub_module\modules\video\classification\videotag_tsn_lstm\resource\models\attention_lstm\lstm_atte…
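(To illustrate the mechanism being asked about, independent of PaddleHub: below is a minimal BiLSTM + additive-attention text classifier written in PyTorch. It is not the referenced attention_lstm code and all sizes are made up; the same structure would need to be translated into PaddlePaddle layers for use with that framework.)
```
# Minimal sketch of a BiLSTM + attention text classifier. Written in PyTorch
# purely to illustrate the mechanism; all sizes are made-up placeholders.
import torch
import torch.nn as nn

class BiLSTMAttnClassifier(nn.Module):
    def __init__(self, vocab_size=5000, embed_dim=128, hidden=128, num_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden, batch_first=True, bidirectional=True)
        self.attn = nn.Linear(2 * hidden, 1)          # scores each time step
        self.fc = nn.Linear(2 * hidden, num_classes)

    def forward(self, token_ids):
        h, _ = self.lstm(self.embed(token_ids))       # [B, T, 2H]
        weights = torch.softmax(self.attn(h), dim=1)  # [B, T, 1], sums to 1 over T
        context = (weights * h).sum(dim=1)            # weighted sum -> [B, 2H]
        return self.fc(context)                       # [B, num_classes]

model = BiLSTMAttnClassifier()
dummy = torch.randint(0, 5000, (4, 20))               # batch of 4, length 20
print(model(dummy).shape)                              # torch.Size([4, 2])
```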