-
May I ask whether the vocabulary that ships with the bert-base-chinese model can be used directly? Also, how was your vocabulary built? Was it obtained by splitting the training set into individual characters?
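A minimal sketch, assuming the Hugging Face transformers loader is acceptable: bert-base-chinese ships with its own vocab.txt, and its tokenizer already splits Chinese text character by character, so a vocabulary built from the training set is not strictly needed.

```python
from transformers import BertTokenizer

# Load the vocabulary bundled with the checkpoint (the vocab.txt inside the model files).
tokenizer = BertTokenizer.from_pretrained('bert-base-chinese')
print(tokenizer.vocab_size)               # size of the built-in vocabulary
print(tokenizer.tokenize('今天天气不错'))   # Chinese text is split into single characters
```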
-
bert-serving-start -model_dir chinese_L-12_H-768_A-12 -num_worker=1 #-http_port 8001
Running this after installation reports the error below:
I:VENTILATOR:[__i:__i: 67]:freeze, optimize and export graph, could take a while...
E:GRAPHOPT:[gra:opt…
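The GRAPHOPT line is truncated above, so the root cause isn't visible; as a small sketch, under the assumption that -model_dir should point at the unzipped Google checkpoint, one can confirm the expected files are present before starting the server.

```python
import os

# Check that the bert-as-service model directory contains the standard checkpoint files.
model_dir = 'chinese_L-12_H-768_A-12'
expected = [
    'bert_config.json',
    'vocab.txt',
    'bert_model.ckpt.index',
    'bert_model.ckpt.meta',
    'bert_model.ckpt.data-00000-of-00001',
]
for name in expected:
    status = 'found' if os.path.exists(os.path.join(model_dir, name)) else 'MISSING'
    print(f'{name}: {status}')
```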
-
* [The Illustrated GPT-2 (Visualizing Transformer Language Models) – Jay Alammar – Visualizing machine learning one concept at a time.](https://jalammar.github.io/illustrated-gpt2/)
* [完全图解GPT-2:看完…
-
Hi, thank you for your wonderful work!
I tried UDA without any augmentation on my text classification task, but I can only get 93% accuracy, while BERT reaches 96% accuracy with the same steps and learni…
-
Hi, may I ask whether a few files here were not uploaded?
self.dict_path = 'E:/bert_weight_files/roberta/vocab.txt'
self.config_path='E:/bert_weight_files/roberta/bert_config_rbt3.json'
self.checkpoint_path='…
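The loader itself isn't shown in the excerpt; below is a hedged sketch assuming a bert4keras-style setup (the usual consumer of dict_path/config_path/checkpoint_path). The checkpoint file name is hypothetical, since the original path is cut off above.

```python
from bert4keras.models import build_transformer_model
from bert4keras.tokenizers import Tokenizer

dict_path = 'E:/bert_weight_files/roberta/vocab.txt'
config_path = 'E:/bert_weight_files/roberta/bert_config_rbt3.json'
checkpoint_path = 'E:/bert_weight_files/roberta/bert_model.ckpt'  # hypothetical: the real value is truncated above

tokenizer = Tokenizer(dict_path, do_lower_case=True)           # character-level tokenizer built from vocab.txt
model = build_transformer_model(config_path, checkpoint_path)  # builds RoBERTa (rbt3) and loads the TF weights
```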
-
from transformers import BertTokenizer, TFBertForSequenceClassification
import tensorflow as tf
tokenizer = BertTokenizer.from_pretrained('./bert-base-chinese')
model = TFBertForSequenceClassific…
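The snippet above is cut off; the following is a hedged completion sketch (assuming a local ./bert-base-chinese directory with TF-compatible weights and a two-label task), not the poster's actual code.

```python
from transformers import BertTokenizer, TFBertForSequenceClassification
import tensorflow as tf

tokenizer = BertTokenizer.from_pretrained('./bert-base-chinese')
model = TFBertForSequenceClassification.from_pretrained('./bert-base-chinese', num_labels=2)

inputs = tokenizer('这是一条测试文本', return_tensors='tf')  # encode one sentence as TF tensors
logits = model(inputs).logits                              # shape (1, 2): one score per label
print(tf.nn.softmax(logits, axis=-1))
```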
-
### Is there an existing issue for this?
- [X] I have searched the existing issues and did not find a match.
### Current Behavior
`BertEmbeddings.pretrained()` can load successfully.
But whe…
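The excerpt cuts off before the failing call, so only the path reported to work is sketched here, assuming this is Spark NLP's BertEmbeddings (the class name and issue template match that project).

```python
import sparknlp
from sparknlp.annotator import BertEmbeddings

spark = sparknlp.start()

# The reportedly working path: download the default pretrained BertEmbeddings annotator.
embeddings = (BertEmbeddings.pretrained()
              .setInputCols(["document", "token"])
              .setOutputCol("embeddings"))
```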
-
Hi @nreimers, it's a nice repo. While reading your code in training_stsbenchmark_bilstm.py, I wanted to test the performance of BERT + BiLSTM, but there may be a bug in the LSTM. I have read all issues abo…
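For context, a hedged sketch of how BERT and a BiLSTM can be stacked from sentence-transformers modules, in the spirit of training_stsbenchmark_bilstm.py; module and parameter names follow older releases and may differ in the version used in the issue.

```python
from sentence_transformers import SentenceTransformer, models

word_model = models.Transformer('bert-base-uncased')              # contextual token embeddings
lstm = models.LSTM(word_embedding_dimension=word_model.get_word_embedding_dimension(),
                   hidden_dim=1024)                               # bidirectional by default
pooling = models.Pooling(lstm.get_word_embedding_dimension())     # mean-pool the LSTM outputs
model = SentenceTransformer(modules=[word_model, lstm, pooling])
```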
-
Hi, thank you for your great work.
distiluse-base-multilingual-cased has one more dense layer than a pooling-only model. How is this dense layer added?
We are constructing a Chinese long text…
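Not the exact distiluse training code, but a hedged sketch of how an extra Dense module is typically appended after pooling in sentence-transformers; in distiluse it maps the 768-dim pooled vector down to 512.

```python
from torch import nn
from sentence_transformers import SentenceTransformer, models

word_model = models.Transformer('distilbert-base-multilingual-cased')
pooling = models.Pooling(word_model.get_word_embedding_dimension())       # 768-dim mean pooling
dense = models.Dense(in_features=pooling.get_sentence_embedding_dimension(),
                     out_features=512,
                     activation_function=nn.Tanh())                       # the extra layer: 768 -> 512
model = SentenceTransformer(modules=[word_model, pooling, dense])
```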
-
If I download this file from Google myself and put it directly under the project directory, can it be used as-is? Do the three .ckpt files need any extra processing?
Thanks
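A hedged illustration (the framework is not named in the question): the three .ckpt.* files in the Google download are addressed through their common prefix, so they normally need no manual processing.

```python
import tensorflow as tf

# Assumed location after dropping the unzipped download into the project folder.
ckpt_prefix = 'chinese_L-12_H-768_A-12/bert_model.ckpt'

reader = tf.train.load_checkpoint(ckpt_prefix)            # reads the .index/.data files as one checkpoint
print(sorted(reader.get_variable_to_shape_map())[:5])     # a few variable names stored in the checkpoint
```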