-
Hello, is the RoBERTa-wwm-ext, Chinese model also loaded with BertModel.from_pretrained? In other words, is the code for using a BERT model and a RoBERTa model in Huggingface-Transformers identical, so that loading RoBERTa-wwm-ext, Chinese via BertModel.from_pretrained already amounts to using the RoBERTa model? I am quite confused and hope you can expla…
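A minimal offline sketch of why the BERT classes are the ones to use here, assuming the `config.json` shipped with hfl/chinese-roberta-wwm-ext declares `"model_type": "bert"` (the checkpoint keeps BERT's architecture; whole-word masking only changes the pretraining masking strategy). The `EXAMPLE_CONFIG` content and the `loader_class_name` helper are illustrative, not part of the library:

```python
import json

# Illustrative stand-in for the config.json shipped with the checkpoint;
# its "model_type" decides which transformers class should load it.
EXAMPLE_CONFIG = '{"model_type": "bert", "hidden_size": 768, "num_hidden_layers": 12}'

def loader_class_name(config_json: str) -> str:
    """Map a checkpoint's declared model_type to the class that should load it."""
    model_type = json.loads(config_json)["model_type"]
    return {"bert": "BertModel", "roberta": "RobertaModel"}[model_type]

print(loader_class_name(EXAMPLE_CONFIG))  # → BertModel
```

Because the declared type is `bert`, `BertModel.from_pretrained("hfl/chinese-roberta-wwm-ext")` is the intended loading call; the RoBERTa classes expect a different tokenizer and embedding layout and would not load this checkpoint correctly.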
-
Could you provide a sample config for a Question Answering model for SQuAD, like this one: https://github.com/deepmipt/DeepPavlov/blob/master/deeppavlov/configs/squad/squad_bert_multilingual_freezed_emb.j…
-
# 📚 Migration
## Information
Model I am using (Bert, XLNet ...):
Language I am using the model on (English...):
The problem arises when using:
* [x] the official example scripts: (giv…
-
**Issue by [sleepinyourhat](https://github.com/sleepinyourhat)**
_Wednesday Jun 26, 2019 at 22:28 GMT_
_Originally opened as https://github.com/nyu-mll/jiant/pull/765_
----
Addresses half of #730. …
-
# ❓ Questions & Help
## Details
Hello. I'm going to fine-tune BERT-base-uncased using a QA dataset I made. However, the following error occurs. Could you tell me how to solve thi…
-
How do I load a model downloaded from elsewhere into the transformers code? I noticed that the model you are using is not one of the default built-in ones.
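For reference, `from_pretrained` also accepts a local directory path instead of a hub model name. A small sketch, assuming the model was downloaded into a local folder (the directory name and the file list are assumptions based on the usual PyTorch checkpoint layout):

```python
import os

# Hypothetical local directory holding a manually downloaded checkpoint.
LOCAL_DIR = "./chinese_roberta_wwm_ext"

def check_local_checkpoint(path: str) -> list:
    """Return the standard files a PyTorch from_pretrained call expects
    but which are missing from the given directory."""
    expected = ["config.json", "vocab.txt", "pytorch_model.bin"]
    return [f for f in expected if not os.path.exists(os.path.join(path, f))]

# Once the files are in place, pass the directory instead of a hub name:
#   model = BertModel.from_pretrained(LOCAL_DIR)
print(check_local_checkpoint(LOCAL_DIR))
```

If the returned list is non-empty, `from_pretrained` will fail; copying the missing files into the directory resolves the usual "can't find model" errors.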
-
# 🐛 Bug
## Information
Model I am using (Bert, XLNet ...): Bert
Language I am using the model on (English, Chinese ...): Chinese
The problem arises when using:
* [ ] the official example …
-
Previously, BERT-wwm improved on the original BERT pretraining, which masked individual characters: whole-word masking (wwm) lets the model learn more word-to-word relations.
Does the current release of ELECTRA also use whole-word masking (wwm) during pretraining?
-
I tried to use the Chinese RoBERTa model following this URL: https://huggingface.co/hfl/chinese-roberta-wwm-ext
But it does not work and raises an error like this:
-
InvalidArgumentError (see above for traceback): Assign requires shapes of both tensors to match. lhs shape= [768] rhs shape= [1024]
[[Node: save/Assign_508 = Assign[T=DT_FLOAT, _class=["loc:@bert/e…
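The `Assign` error above typically means the TensorFlow graph was built from a base-size config (hidden_size 768) while the checkpoint being restored stores large-size tensors (hidden_size 1024), e.g. pairing a `-base` config file with `-large` weights. A small sketch of the diagnosis, comparing shapes before restoring; the variable name and shapes are illustrative:

```python
# Shapes the graph expects (from the config used to build the model) versus
# shapes actually stored in the checkpoint. A mismatch on any shared variable
# is exactly what triggers TensorFlow's "Assign requires shapes ... to match".
graph_shapes = {"bert/embeddings/LayerNorm/beta": (768,)}   # built from a base config
ckpt_shapes  = {"bert/embeddings/LayerNorm/beta": (1024,)}  # stored in a large checkpoint

def shape_mismatches(graph: dict, ckpt: dict) -> dict:
    """Return {variable: (graph_shape, ckpt_shape)} for every conflicting variable."""
    return {name: (graph[name], ckpt[name])
            for name in graph
            if name in ckpt and graph[name] != ckpt[name]}

print(shape_mismatches(graph_shapes, ckpt_shapes))
```

The fix is to use the `bert_config.json` that ships with the same checkpoint, so every variable's shape matches what the saver tries to assign.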