-
ValueError: Couldn't find 'checkpoint' file or checkpoints in given directory chinese_roberta_wwm_ext_L-12_H-768_A-12
Tried using https://github.com/huggingface/pytorch-transformers/blob/master/pytorch_tra…
-
Are there reported downstream-task results for BERT-wwm-ext?
The README.md provides some results, but they seem incomplete. Is a complete report available?
Also, will a paper on the ext models be published? Thanks.
-
From [here](https://arxiv.org/pdf/1906.08237.pdf) on page 16, it seems we should set Layer-wise lr decay to 0.75. However, I didn't find a way to do so in `run_squad.py`. Could someone provide a sampl…
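Since `run_squad.py` does not expose a layer-wise decay flag, a minimal sketch of the idea is to build optimizer parameter groups manually, scaling each encoder layer's learning rate by `decay ** (distance from the top layer)`. The function name and grouping heuristic below are illustrative, not part of the official script:

```python
# Hedged sketch: layer-wise learning-rate decay via optimizer parameter groups.
# The parameter-name patterns assume a BERT-style model from pytorch_transformers
# (e.g. "bert.encoder.layer.11...."); adjust them to your model's naming.
def layerwise_lr_groups(named_params, base_lr=3e-5, decay=0.75, num_layers=12):
    """Assign each parameter a lr of base_lr * decay**depth, where depth is
    the distance from the top encoder layer (top layer and task head: 0)."""
    groups = []
    for name, param in named_params:
        if "encoder.layer." in name:
            layer_idx = int(name.split("encoder.layer.")[1].split(".")[0])
            depth = num_layers - 1 - layer_idx  # top layer decays least
        elif "embeddings" in name:
            depth = num_layers  # embeddings decay the most
        else:
            depth = 0  # pooler / task head keep the base lr
        groups.append({"params": [param], "lr": base_lr * (decay ** depth)})
    return groups

# The resulting groups would then be passed to the optimizer, e.g.
# optimizer = AdamW(layerwise_lr_groups(model.named_parameters(), decay=0.75))
```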
-
Hello, I have a question. The original RoBERTa paper uses a BPE tokenizer, yet the RoBERTa-wwm from SCIR still seems to use the BERT family of classes (BertConfig, BertTokenizer, i.e. the classes in Hugging Face pytorch_transformers). So is the model overall still BERT-wwm, just trained in the style of RoBERTa?
-
Loading chinese_roberta_wwm_ext_pytorch with pytorch_transformers raises an error (loaded with RobertaConfig, RobertaForSequenceClassification, RobertaTokenizer).
-
BERT-wwm-ext loads fine; this one does not.
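A likely cause: the HFL Chinese RoBERTa-wwm checkpoints keep BERT's vocabulary and architecture, so they are loaded with the `Bert*` classes rather than the `Roberta*` classes. A minimal sketch, assuming the weights sit in a local directory (path name illustrative):

```python
# Hedged sketch: load the Chinese RoBERTa-wwm checkpoint with Bert* classes,
# since the checkpoint follows BERT's architecture and WordPiece vocabulary.
from pytorch_transformers import BertConfig, BertTokenizer, BertForSequenceClassification

model_dir = "chinese_roberta_wwm_ext_pytorch"  # local directory with the converted weights

config = BertConfig.from_pretrained(model_dir)
tokenizer = BertTokenizer.from_pretrained(model_dir)
model = BertForSequenceClassification.from_pretrained(model_dir, config=config)
```

Using `RobertaTokenizer` here would look for BPE merge files that the checkpoint does not ship, which matches the error described above.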
-
Are there any plans to compare against the newly released RoBERTa-wwm-ext?
-
As the title says.
Many thanks!
-
## 🐛 Bug
Model I am using (Bert, XLNet....): BERT base uncased
Language I am using the model on (English, Chinese....): English
The problem arises when using:
* [x] the official example scrip…
-
## ❓ Questions & Help
I don't know whether this project lets me select the BERT variant I need. For example, I want to use BERT-wwm rather than BERT-base. What should I do? Can you help me, please?
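With the Hugging Face libraries, a pretrained variant is selected by its model identifier. A minimal sketch, assuming the modern `transformers` package and the HFL hub identifier `hfl/chinese-bert-wwm` (substitute the identifier for the variant you need):

```python
# Hedged sketch: selecting a specific BERT variant by its hub identifier.
# "hfl/chinese-bert-wwm" is the whole-word-masking Chinese BERT published by HFL.
from transformers import BertTokenizer, BertModel

model_id = "hfl/chinese-bert-wwm"
tokenizer = BertTokenizer.from_pretrained(model_id)
model = BertModel.from_pretrained(model_id)
```

The same `from_pretrained` call also accepts a local directory containing the downloaded weights, config, and vocab files.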