-
Dear Authors,
Thanks for open-sourcing the code. I tried pretraining for 100k steps and fine-tuning on VQAv2, but my dev-test score is about 65, not the 70.8 reported in the paper.
Here is my pretrain and finetu…
-
I'm sorry to bother you again.
I would like to know whether the code for the paper ('A BERT-based two-stage model for Chinese Chengyu recommendation', regarding the two stages) only uses 'train_pretrain.py' an…
-
**Question**
I was wondering if it works with Spanish document databases?
**Additional context**
Also, is it possible to use MongoDB?
-
## Environment info
- `transformers` version: 4.4.dev0
- Platform: Ubuntu 18
- Python version: 3.7
- PyTorch version (GPU?): 1.7.1 (YES)
- Tensorflow version (GPU?):
- Using GPU in script?: …
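For filling in an environment report like the one above, a small helper such as the following can collect the platform and package versions (a hypothetical sketch, not part of `transformers`; the library also ships a `transformers-cli env` command for this purpose):

```python
import platform
from importlib import metadata


def env_report(packages=("transformers", "torch")):
    """Collect platform, Python, and package version info for a bug report."""
    lines = [
        f"Platform: {platform.platform()}",
        f"Python version: {platform.python_version()}",
    ]
    for pkg in packages:
        try:
            lines.append(f"{pkg} version: {metadata.version(pkg)}")
        except metadata.PackageNotFoundError:
            # Package is not installed in this environment
            lines.append(f"{pkg} version: not installed")
    return "\n".join(lines)


print(env_report())
```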
-
When I run membertret, I can't find several files.
1. /home/zhengchujie/bert_torch/chinese_wwm_pytorch/bert_config.json&vocab.txt&pytorch_model.bin
As a result, I downloaded an alternative from http…
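The three files named above (`bert_config.json`, `vocab.txt`, `pytorch_model.bin`) are the usual components of a BERT PyTorch checkpoint directory. A quick way to check whether a local model directory is complete (an illustrative sketch; the path is the one from the issue, and `missing_files` is a hypothetical helper, not part of any library):

```python
import os

# Files a BERT PyTorch checkpoint directory is expected to contain
REQUIRED = ("bert_config.json", "vocab.txt", "pytorch_model.bin")


def missing_files(model_dir):
    """Return the required checkpoint files absent from model_dir."""
    return [f for f in REQUIRED
            if not os.path.isfile(os.path.join(model_dir, f))]


print(missing_files("/home/zhengchujie/bert_torch/chinese_wwm_pytorch"))
```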
-
After running test.sh, it reads an ERNIE model. I downloaded the ernie-1.0 (Chinese) model from https://github.com/nghuyong/ERNIE-Pytorch and put the extracted files under bert/torch_ernie_1/, but it still reports an error. Where did you download the ERNIE model from, and what processing did you apply? Thanks.
-
```text
Weights from pretrained model not used in BertForPreTraining:
['bert.encoder.layer.0.attention.self.relative_positions_encoding.positions_encoding',
'bert.encoder.layer.1.attention.se…
-
03/16/2021 16:09:53 - INFO - transformers.modeling_utils - loading weights file /home/lab/Desktop/xf_event_extraction2020Top1-master/bert/torch_roberta_wwm/pytorch_model.bin
03/16/2021 16:16:01 - I…
-
Hello, when I use your chinese-roberta-wwm-ext-large model for the MLM task, there seems to be a bug. I tried both the google/bert inference code and the Hugging Face Transformers inference code, and both show obvious problems. Here is the code that calls Transformers:
```python3
from transformers import *…