-
Hi,
When trying to use the transformers pipelines, the 'fill-mask' case does work fine:
`nlp_fill = pipeline('fill-mask', model="dccuchile/bert-base-spanish-wwm-cased", tokenizer="…
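For context, a minimal sketch of a working fill-mask call along those lines; the example sentence and the printed fields are illustrative, assuming a transformers version where the pipeline returns a list of dicts with `sequence` and `score`:

```python
from transformers import pipeline

# Fill-mask with the Spanish BERT (BETO) checkpoint named in the report.
nlp_fill = pipeline(
    "fill-mask",
    model="dccuchile/bert-base-spanish-wwm-cased",
    tokenizer="dccuchile/bert-base-spanish-wwm-cased",
)

# Illustrative sentence (not from the original issue); BETO uses the [MASK] token.
for prediction in nlp_fill("Madrid es la [MASK] de España."):
    print(prediction["sequence"], prediction["score"])
```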
-
I've followed some examples in transformers with no success:
```
from transformers import AutoTokenizer, TFAutoModelForQuestionAnswering
import tensorflow as tf
tokenizer = AutoTokenizer.fro…
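# A hedged sketch of how this TF question-answering example is usually
# completed. The SQuAD-finetuned checkpoint, the question/context strings and
# the attribute-style outputs below are assumptions for a recent transformers
# version, not taken from the original report.
tokenizer = AutoTokenizer.from_pretrained("bert-large-uncased-whole-word-masking-finetuned-squad")
model = TFAutoModelForQuestionAnswering.from_pretrained("bert-large-uncased-whole-word-masking-finetuned-squad")

question = "Who wrote the report?"
context = "The report was written by Alice."
inputs = tokenizer(question, context, return_tensors="tf")
outputs = model(inputs)

# Take the most likely start/end token positions and decode the answer span.
answer_start = int(tf.argmax(outputs.start_logits, axis=1)[0])
answer_end = int(tf.argmax(outputs.end_logits, axis=1)[0]) + 1
answer = tokenizer.decode(inputs["input_ids"][0][answer_start:answer_end])
print(answer)
```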
-
Hello, thank you so much for sharing this. But when I test the converted ERNIE model using [pytorch-transformers](https://github.com/huggingface/pytorch-transformers), the performance on the cloze task is…
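For reference, a minimal cloze-style spot check with pytorch-transformers might look like the sketch below; the local checkpoint path and the example sentence are placeholders, not from the original report:

```python
import torch
from pytorch_transformers import BertTokenizer, BertForMaskedLM

# Hypothetical path to the converted ERNIE checkpoint (placeholder).
model_path = "./ernie_converted"
tokenizer = BertTokenizer.from_pretrained(model_path)
model = BertForMaskedLM.from_pretrained(model_path)
model.eval()

# Mask one character of a Chinese sentence and inspect the top predictions.
text = "哈尔滨是黑龙江的省会。"
tokens = ["[CLS]"] + tokenizer.tokenize(text) + ["[SEP]"]
mask_index = tokens.index("黑")
tokens[mask_index] = "[MASK]"
input_ids = torch.tensor([tokenizer.convert_tokens_to_ids(tokens)])

with torch.no_grad():
    prediction_scores = model(input_ids)[0]

top_ids = prediction_scores[0, mask_index].topk(5)[1].tolist()
print(tokenizer.convert_ids_to_tokens(top_ids))
```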
-
Hello, is the RoBERTa-wwm-ext, Chinese model also loaded with BertModel.from_pretrained? Is the code for using a BERT model and a RoBERTa model in Huggingface-Transformers the same? If I load RoBERTa-wwm-ext, Chinese with BertModel.from_pretrained, does that mean I am effectively using a RoBERTa model? I don't really understand this and hope you can expl…
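For reference, a minimal sketch assuming the checkpoint in question is the hfl hub release `hfl/chinese-roberta-wwm-ext`, which is distributed in BERT format and is therefore loaded with the Bert* classes even though its pretraining recipe is RoBERTa-style; the forward pass assumes a recent transformers version:

```python
from transformers import BertTokenizer, BertModel

# Assumed hub id for the RoBERTa-wwm-ext, Chinese checkpoint.
tokenizer = BertTokenizer.from_pretrained("hfl/chinese-roberta-wwm-ext")
model = BertModel.from_pretrained("hfl/chinese-roberta-wwm-ext")

inputs = tokenizer("哈尔滨是黑龙江的省会。", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)
```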
-
I understand that you used the LTP tokenizer during pretraining. But do I also need to use the LTP tokenizer when fine-tuning with your model? When I load any of your models through the transformers library, either it still uses the same character-based tokenizer as BERT base, or it reports that the model's vocab.json cannot be found. Is this a bug?
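A minimal loading sketch, assuming the checkpoint is one of the hfl whole-word-masking releases (`hfl/chinese-bert-wwm-ext` is a placeholder id): LTP is only used at pretraining time to choose whole-word masks, while fine-tuning uses the shipped character-level vocab.txt with BertTokenizer; loading with the Roberta* classes instead would look for a vocab.json that these checkpoints do not include, which would match the reported error.

```python
from transformers import BertTokenizer, BertModel

# Placeholder checkpoint id; the hfl wwm releases are loaded with the Bert* classes.
tokenizer = BertTokenizer.from_pretrained("hfl/chinese-bert-wwm-ext")
model = BertModel.from_pretrained("hfl/chinese-bert-wwm-ext")

# The tokenizer is character based, the same as BERT base; no LTP segmentation
# is applied at fine-tuning or inference time.
print(tokenizer.tokenize("我在微调时不需要LTP分词。"))
```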
-
Could you provide a sample config for a Question Answering model on SQuAD, like this one: https://github.com/deepmipt/DeepPavlov/blob/master/deeppavlov/configs/squad/squad_bert_multilingual_freezed_emb.j…
-
# 📚 Migration
## Information
Model I am using (Bert, XLNet ...):
Language I am using the model on (English...):
The problem arises when using:
* [x] the official example scripts: (giv…
-
# ❓ Questions & Help
## Details
Hello. I'm going to fine-tune BERT-base-uncased using a QA dataset I made. However, the following error occurs. Could you tell me how to solve thi…
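Since the error itself is cut off, here is only a minimal sketch of a single supervised QA training step with BertForQuestionAnswering, assuming a recent transformers version; the toy question/context and the answer span are placeholders, not taken from the original report:

```python
import torch
from transformers import BertTokenizerFast, BertForQuestionAnswering

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = BertForQuestionAnswering.from_pretrained("bert-base-uncased")

# Toy example; in a real run these come from the custom QA dataset.
question = "Who wrote the report?"
context = "The report was written by Alice."
inputs = tokenizer(question, context, return_tensors="pt")

# Map the answer's character span in the context to token positions.
answer_start_char = context.index("Alice")
start_token = inputs.char_to_token(answer_start_char, sequence_index=1)
end_token = inputs.char_to_token(answer_start_char + len("Alice") - 1, sequence_index=1)

outputs = model(
    **inputs,
    start_positions=torch.tensor([start_token]),
    end_positions=torch.tensor([end_token]),
)
outputs.loss.backward()  # backward pass for one training step
print(float(outputs.loss))
```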
-
# 🐛 Bug
## Information
Model I am using (Bert, XLNet ...): Bert
Language I am using the model on (English, Chinese ...): Chinese
The problem arises when using:
* [ ] the official example …
-
**Issue by [sleepinyourhat](https://github.com/sleepinyourhat)**
_Wednesday Jun 26, 2019 at 22:28 GMT_
_Originally opened as https://github.com/nyu-mll/jiant/pull/765_
----
Addresses half of #730. …