-
May I ask which script is used to convert the HFL Chinese RoBERTa-wwm-large-ext model into the UER format? Is it convert_bert_from_huggingface_to_uer.py?
I wonder how I can convert the HFL Chinese RoBERTa-wwm-large-ext model into UER style? Which script s…
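For reference, a minimal sketch of driving such a conversion from Python, assuming UER-py's convert_bert_from_huggingface_to_uer.py accepts --input_model_path, --output_model_path, and --layers_num (these flag names are assumptions; check the script's argument parser for the actual options):
```
# Sketch only: download the HF checkpoint, then call the UER-py conversion
# script. The flag names below are assumptions, not verified against the repo.
import subprocess
from huggingface_hub import snapshot_download

local_dir = snapshot_download("hfl/chinese-roberta-wwm-ext-large")

subprocess.run(
    [
        "python3", "scripts/convert_bert_from_huggingface_to_uer.py",
        "--input_model_path", f"{local_dir}/pytorch_model.bin",
        "--output_model_path", "chinese_roberta_wwm_large_ext_uer.bin",
        "--layers_num", "24",  # the large model has 24 transformer layers
    ],
    check=True,
)
```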
-
As the title says, a few questions about the tnews 1.1 test set:
1. Do you have BERT-base, BERT-wwm-ext, ERNIE-base, RoBERTa-large, XLNet-mid, ALBERT-base, ALBERT-large, ALBERT-xlarge, ALBERT-xxlarge, ALBERT-tiny, RoBERTa-wwm-ext, RoBERTa-wwm-…
-
404 Client Error: Not Found for url: https://huggingface.co//model/chinese-bert-wwm-ext/resolve/main/config.json
Traceback (most recent call last):
File "H:\Anaconda\envs\softmasked\lib\site-packa…
-
Hi,
When I try to run step 1 -- dataset formatting -- I get the following error:
```
vocal_Saya.wav.reformatted.wav_10.flac_0022832960_0022952000.wav
vocal_Saya.wav.reformatted.wav_10.flac_0022…
```
-
How do I use models that aren't in the specified list?
I would like to use this model:
https://huggingface.co/dccuchile/bert-base-spanish-wwm-uncased
How do I go about doing this?
Regards,…
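If the surrounding tool accepts an arbitrary Hugging Face checkpoint name, a minimal sketch of loading this model directly with transformers (whether the tool itself supports models outside its list is the open question here):
```
# Sketch: load the Spanish wwm BERT straight from the Hub via transformers.
from transformers import AutoTokenizer, AutoModel

model_id = "dccuchile/bert-base-spanish-wwm-uncased"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)
```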
-
from kashgari.corpus import ChineseDailyNerCorpus
from kashgari.embeddings import BertEmbedding
from kashgari.tasks.labeling import BiLSTM_CRF_Model
import kashgari
X_train, y_train = ChineseDa…
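For context, a minimal runnable sketch of the intended Kashgari 1.x workflow, assuming a local chinese-bert-wwm checkpoint directory (the path and hyperparameters below are placeholders):
```
# Sketch of a typical Kashgari 1.x NER workflow with a BERT embedding.
from kashgari.corpus import ChineseDailyNerCorpus
from kashgari.embeddings import BertEmbedding
from kashgari.tasks.labeling import BiLSTM_CRF_Model

x_train, y_train = ChineseDailyNerCorpus.load_data('train')
x_valid, y_valid = ChineseDailyNerCorpus.load_data('validate')

# Path to an unpacked chinese-bert-wwm checkpoint folder (placeholder).
embedding = BertEmbedding('path/to/chinese_bert_wwm', sequence_length=100)
model = BiLSTM_CRF_Model(embedding)
model.fit(x_train, y_train, x_valid, y_valid, epochs=5)
```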
-
Hello! I'd like to ask which pretrained model you used: chinese-bert-wwm or RoBERTa? Thank you!
-
Reason: in our internal tests, the nezha model performs better than BERT variants such as bert-wwm-ext. At the same time, as far as we know, FasterTransformer does not yet support nezha, and ONNX Runtime only applies generic optimizations when accelerating nezha, so it does not reach BERT-level speed.
-
I used the bert_chinese_wwm pretrained config, but it fails because there is no attribute 'reduce_dim'. I can't find your config file; could you share it with me?
-
Hi, wondering if you can help... I'm facing an issue when loading main.py
Traceback (most recent call last):
File "C:\CP\LocalAIVtuber\venv\lib\site-packages\transformers\utils\hub.py", line …