-
Dear Author,
I am trying to reproduce the recommendation (rec) performance on the INSPIRED dataset.
![image](https://github.com/wxl1999/UniCRS/assets/60718476/4fb55643-4784-4995-8271-fa1479cd766b)
I use the hyper…
-
### Feature request
I would like to request that BetterTransformer not be deprecated. See also [optimum#2083](https://github.com/huggingface/optimum/issues/2083).
This issue is intended to track t…
-
Could someone tell me what is going wrong here, please?
Traceback (most recent call last):
File "train.py", line 90, in
student=config.create_student(nocrf=args.nocrf)  # call config's create_student method to create a student model, specifying whether to enable CRF
F…
-
When asking a question, please provide as much of the following information as possible:
### Basic information
- Your **operating system**: Ubuntu
- Your **Python** version: 3.7
- Your **Tensorflow** version: 1.14
- Your **Keras** version: 2.3.1
- Your **bert4keras** version: 0.10.6
- Whether you use pure **keras**:
### Core code
```p…
-
Hello, thank you for releasing your code. When I try to reproduce your results, I only get 78.6 for RoBERTa_large. I noticed that you do not mention which pre-trained models you use to retrieve neg…
-
I am trying to run an example from README.md and am running into an error:
code:
```bash
TASK="BC5CDR-chem"
DATADIR="data/tasks/BC5CDR-chem.model=roberta-large.maxlen=512"
MODEL=roberta-large
M…
-
1. I want to do incremental pre-training on the existing RoBERTa. Which RoBERTa model should I use? Should I download it directly from Hugging Face? Do I need to convert it into UER format with a script after downloading it? Is…
-
Code: model = AutoModelForSequenceClassification.from_pretrained(checkpoint)
Running it fails with an error about a missing configuration file:
ValueError: Unrecognized model in XXX. **Should have a `model_type` key in its config.json**, or contain one of t…
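For context, this ValueError usually means the checkpoint directory's config.json lacks the `model_type` key that the Auto classes use to dispatch to the right architecture. A minimal sketch (using only the standard library; the directory path and the `"roberta"` value are assumptions — use whatever architecture the checkpoint was actually trained with) of checking and patching the file:

```python
import json
from pathlib import Path

def ensure_model_type(checkpoint_dir: str, model_type: str) -> dict:
    """Add a model_type key to config.json if it is missing, and return the config."""
    config_path = Path(checkpoint_dir) / "config.json"
    config = json.loads(config_path.read_text())
    if "model_type" not in config:
        config["model_type"] = model_type  # e.g. "roberta" for RoBERTa checkpoints
        config_path.write_text(json.dumps(config, indent=2))
    return config

# Hypothetical usage — point it at your own local checkpoint directory:
# config = ensure_model_type("path/to/checkpoint", "roberta")
```

After patching, `from_pretrained(checkpoint)` should be able to dispatch; alternatively, one can sidestep auto-detection by loading through the concrete model class (e.g. `RobertaForSequenceClassification`) instead of the Auto class.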
-
Hi, thanks for the great example on training RoBERTa with long attention.
Followed this example: https://github.com/allenai/longformer/blob/master/scripts/convert_model_to_long.ipynb
Was able to s…
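For readers unfamiliar with that notebook: its core trick is extending RoBERTa's learned position embeddings (512 positions) to a longer window by copying the learned rows repeatedly. A plain-Python sketch of just that copying idea (the function name and list-of-rows representation are illustrative, not the notebook's actual variables, and the notebook additionally handles RoBERTa's position-id offset of 2, which is omitted here):

```python
def extend_position_embeddings(old_emb, new_max_pos):
    """Tile learned position embeddings to cover a longer sequence length.

    old_emb: a list of rows (each row a list of floats), one per position.
    Returns new_max_pos rows that repeat the old rows cyclically, so every
    new position reuses a learned embedding instead of a random one.
    """
    old_max_pos = len(old_emb)
    return [list(old_emb[i % old_max_pos]) for i in range(new_max_pos)]
```

In the real conversion the same tiling is done on the model's position-embedding weight matrix (e.g. 512 → 4096 by copying the block 8 times), which is why the long model starts from a sensible initialization rather than needing pre-training from scratch.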
-
@ArthurZucker I am trying to train a bytepiece tokenizer on my dataset. I have a list of words that I want to be treated as single tokens. But when I train it and tokenize, I observe that the token …
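For what it's worth, in the Hugging Face `tokenizers` library the usual way to keep whole words atomic is to register them as added or special tokens (e.g. `special_tokens=` on the trainer, or `tokenizer.add_tokens` with `AddedToken(word, single_word=True)`), so the matcher extracts them before subword merging runs. The underlying mechanism can be sketched in plain Python (the function name and tuple representation are illustrative, not the library's internals):

```python
import re

def split_protecting(text, protected):
    """Split text into pieces, keeping each protected word as one atomic piece.

    Returns (piece, is_protected) tuples: protected pieces become single tokens;
    the rest would be handed to the subword tokenizer afterwards.
    """
    if not protected:
        return [(text, False)]
    # Longest-first alternation so overlapping protected words match greedily.
    pattern = re.compile("|".join(re.escape(w) for w in sorted(protected, key=len, reverse=True)))
    pieces, last = [], 0
    for m in pattern.finditer(text):
        if m.start() > last:
            pieces.append((text[last:m.start()], False))  # normal text
        pieces.append((m.group(), True))                  # protected word, kept whole
        last = m.end()
    if last < len(text):
        pieces.append((text[last:], False))
    return pieces
```

If the words are registered only after training (rather than via the trainer), they never become special tokens in the learned vocabulary, which is one common reason they still get split.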