-
Hi! I am trying to reproduce the Spanish part, but I get a ValueError: 'anticip' is not in list. I have already downloaded bert-base-spanish-wwm-uncased.
Could you please provide suggestions fo…
-
Hi,
Transformer models like "bert-base-multilingual-uncased" can be used in "minicons" to help compute token surprisal or probabilities for different languages if we have a text
of th…
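For reference, a minimal sketch of this with minicons (assuming its scorer API; the model name and sentences are illustrative):

```python
# A minimal sketch, assuming the minicons scorer API
# (https://github.com/kanishkamisra/minicons).
from minicons import scorer

# Wrap a multilingual masked LM as a scorer.
mlm = scorer.MaskedLMScorer("bert-base-multilingual-uncased", "cpu")

# Illustrative sentences in two languages.
sentences = ["El gato duerme en la cama.", "Die Katze schläft im Bett."]

# Per-token surprisal (in bits); yields (token, score) pairs per sentence.
for sent, scores in zip(sentences,
                        mlm.token_score(sentences, surprisal=True, base_two=True)):
    print(sent, scores)
```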
-
In reference to #535, I get "Illegal instruction (core dumped)" and the Python prompt exits.
I have installed PyTorch through the Anaconda package manager for Python 3.6 by running the command
`conda …
-
**Rasa version**:
2.8.1
**Rasa SDK version** (if used & relevant):
2.8.1
**Rasa X version** (if used & relevant):
**Python version**:
3.7.10
**Operating system** (windows, osx, ...)…
-
### Feature request
Adaptive pretraining methods like domain-adaptive pretraining and task-adaptive pretraining can benefit downstream tasks, as illustrated in [https://aclanthology.org/2020.ac…
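For reference, a minimal sketch of task-adaptive pretraining as continued masked-LM training with Hugging Face transformers; the file name, model, and hyperparameters are illustrative assumptions:

```python
# A minimal sketch of task-adaptive pretraining: continue masked-LM
# training on unlabeled in-domain text, then fine-tune the resulting
# checkpoint on the downstream task. Names below are illustrative.
from transformers import (AutoTokenizer, AutoModelForMaskedLM,
                          DataCollatorForLanguageModeling,
                          Trainer, TrainingArguments)
from datasets import load_dataset

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")

# Unlabeled in-domain text, one example per line (hypothetical file).
dataset = load_dataset("text", data_files={"train": "domain_corpus.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128)

tokenized = dataset["train"].map(tokenize, batched=True,
                                 remove_columns=["text"])

# Randomly masks 15% of tokens for the MLM objective.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer,
                                           mlm_probability=0.15)

args = TrainingArguments(output_dir="tapt-bert", num_train_epochs=3,
                         per_device_train_batch_size=16)

Trainer(model=model, args=args, train_dataset=tokenized,
        data_collator=collator).train()
model.save_pretrained("tapt-bert")  # fine-tune this checkpoint downstream
```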
-
Hi Fluid team, I am trying to follow the serverless demo [here](https://github.com/fluid-cloudnative/fluid/blob/master/docs/zh/samples/knative.md).
Instead of using data from the web, my dataset is fro…
-
The error says there is no bert-wwm-uncased. Could you please help me with this? Thanks.
-
Hi 波哥, I want to add an adversarial training method to a relation extraction model, using ERNIE as the pretrained model. Do I need to modify the pretrained model's embedding layer? After I switched pretrained models, adversarial training has no effect.
![image](https://user-images.githubusercontent.com/66951248/191635173-98919f4f-0c68-4072-b4cd-4157c963a38a.png)…
-
On the downstream NER task, directly fine-tuning the PyTorch-based Chinese whole-word-masking pretrained models downloaded from huggingface (chinese-bert-wwm-ext, chinese-roberta-wwm-ext, chinese-roberta-wwm-ext-large, etc.) already gives good results. Now I would like to first do unsupervised adaptive fine-tuning on the downstream task's corpus (equivalent to using the downstream task's da…
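For reference, a minimal sketch of keeping the whole-word-masking objective during that adaptive step, assuming Hugging Face transformers' `DataCollatorForWholeWordMask` (model names here are illustrative):

```python
# A minimal sketch, assuming Hugging Face transformers: continue
# masked-LM training with whole-word masking so the adaptive step
# matches the -wwm- checkpoints' pretraining objective.
from transformers import (AutoTokenizer, AutoModelForMaskedLM,
                          DataCollatorForWholeWordMask)

tokenizer = AutoTokenizer.from_pretrained("hfl/chinese-roberta-wwm-ext")
model = AutoModelForMaskedLM.from_pretrained("hfl/chinese-roberta-wwm-ext")

# DataCollatorForWholeWordMask masks all sub-tokens of a word together;
# for Chinese, word boundaries come from `chinese_ref` fields produced
# by a segmenter such as LTP (see transformers' run_mlm_wwm.py example).
collator = DataCollatorForWholeWordMask(tokenizer=tokenizer,
                                        mlm_probability=0.15)
# Plug `collator` into a Trainer as in a standard continued-MLM setup,
# then fine-tune the saved checkpoint on the NER task.
```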
-
### System Info
```shell
transformers 2.5.1
python3.8
pytorch 1.10.2
All packages installed with conda by way of the conda-forge or powerai repositories, all of t…