-
I use the following code:

```python
from transformers import BertTokenizer, BertModel
bert = BertModel.from_pretrained("hfl/chinese-roberta-wwm-ext")
bert_tokenizer = BertTokenizer.from_pretrained("hfl/chinese-roberta-ww…
```
-
**Describe the bug**
I have trained a model with this definition on my GPU.
![image](https://user-images.githubusercontent.com/55831555/81293330-6c3d1980-906d-11ea-9e5b-50eb5b231413.png)
The …
-
My understanding is that ELECTRA's modeling code is the same as BERT's, and the pre-training improvement lies in the training objective. I'd like to ask whether ELECTRA's run_finetuning.py can directly load BERT, BERT-wwm, RoBERTa, and similar checkpoints. Loading them as-is does not seem to work at the moment — which code block would mainly need to be modified?
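One likely incompatibility (an assumption here, not confirmed in this thread) is that ELECTRA's TF checkpoints use an `electra/` variable scope while BERT checkpoints use `bert/`, so loading a BERT checkpoint would need a variable-name remapping. A sketch of the idea on plain strings, with a hypothetical helper name:

```python
def remap_bert_to_electra(name: str) -> str:
    # Hypothetical helper: rewrite a BERT checkpoint variable name into the
    # scope ELECTRA's run_finetuning.py expects. This only illustrates the
    # kind of mapping needed; exact scopes depend on the actual checkpoints.
    prefix = "bert/"
    if name.startswith(prefix):
        return "electra/" + name[len(prefix):]
    return name

print(remap_bert_to_electra("bert/encoder/layer_0/attention/self/query/kernel"))
# electra/encoder/layer_0/attention/self/query/kernel
```

In practice this remapping would be applied when building the assignment map between the checkpoint and the graph variables during initialization.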
-
## Environment info
- `transformers` version: 3.4.0
- Platform: Linux-4.15.0-122-generic-x86_64-with-Ubuntu-18.04-bionic
- Python version: 3.6.9
- PyTorch version (GPU?): 1.5.1+cpu (False…
-
# ❓ Questions & Help
![image](https://user-images.githubusercontent.com/15426714/87268409-dd30ff80-c4fc-11ea-9e34-4d1c06dc1d23.png)
-
I ran the script from https://github.com/ray-project/ray/tree/master/python/ray/tune/examples/pbt_transformers,
and found a core file in the script directory:
![image](https://user-images.githubuserconte…
-
# ❓ Questions & Help
## Details
```
> from transformers import AutoTokenizer, AutoModelWithLMHead
> tokenizer = AutoTokenizer.from_pretrained("hfl/chinese-roberta-wwm-ext")
I0710 17:52:53…
-
Hello,
Thank you for providing the pre-trained models. I'd like to ask: when BERT-wwm was pre-trained, was the Chinese Wikipedia corpus in Simplified Chinese, Traditional Chinese, or both?
-
**Rasa version**:
2.1.0
**Rasa SDK version** (if used & relevant):
**Rasa X version** (if used & relevant):
**Python version**:
3.7
**Operating system** (windows, osx, ...):
osx
…
-
With the Chinese-wwm-bert pre-trained model, WSC accuracy is only 50%, while the released baseline model reaches about 62%. Has anyone else run into such a large accuracy gap? Any advice would be appreciated.