-
D:\asdasd\AI\GPT-SoVITS-Server-main\GPT-SoVITS-Server-main>python server.py
DirectML is available; DirectML will be used for inference acceleration.
Device name: NVIDIA GeForce GTX 1650
Traceback (most recent call last):
File "D:\asdasd\AI\GPT-…
-
RuntimeError: mat1 and mat2 shapes cannot be multiplied (11264x1024 and 768x6)
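The mismatch (11264×1024 vs 768×6) typically means a downstream linear layer was built for 768-dim (base-size) BERT features but is being fed 1024-dim (large-size) hidden states. A minimal sketch reproducing the error, assuming a `torch.nn.Linear` projection head (the layer sizes are taken from the error message, not from the actual code):

```python
import torch

# A projection head sized for a 768-dim "base" BERT receives 1024-dim
# features from a "large" BERT -> matmul shapes no longer line up.
proj = torch.nn.Linear(768, 6)      # expects 768-dim hidden states
feats = torch.randn(11264, 1024)    # 1024-dim hidden states arrive instead
try:
    proj(feats)                     # (11264x1024) @ (768x6) cannot multiply
except RuntimeError as e:
    print(e)
```

Swapping in a BERT checkpoint whose `hidden_size` differs from the one the head was trained against produces exactly this class of error.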
```
'roberta': (
    'transformers.BertTokenizer',
    'transformers.RobertaModel',
    'transforme…
```
-
Hello, I was wondering whether we can replace the BERT models for the currently supported languages
```python
Languages.JP: BASE_DIR / "bert" / "deberta-v2-large-japanese-char-wwm",
Languages.EN: BASE_DIR / "…
```
-
bert-as-service can support the bert-wwm-ext model released at https://github.com/ymcui/Chinese-BERT-wwm
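Since bert-wwm-ext is distributed as a standard TensorFlow BERT checkpoint, bert-as-service can serve it by pointing `-model_dir` at the unzipped checkpoint directory. A sketch of the invocation (the path and directory name are placeholders for wherever you unzipped the release):

```shell
# Placeholder path: point -model_dir at the unzipped bert-wwm-ext checkpoint.
bert-serving-start -model_dir /path/to/chinese_wwm_ext_checkpoint -num_worker=1
```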
-
The project I'm working on requires RoBERTa. Could I simply put the model and vocab for chinese-roberta-wwm-ext into the bert_pretain directory and train directly?
-
Hi, if I would like to apply the diversity-selection methodology to a Chinese SFT dataset (say, Alpaca-cn-gpt4), can I simply change the model to a Chinese BERT (https://huggingface.co/bert-base-chi…
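In principle the swap is just a model-name change, provided the rest of the pipeline only consumes sentence embeddings. A hedged sketch (the model name and mean-pooling scheme are assumptions, not necessarily the original method's exact setup):

```python
import torch
from transformers import AutoModel, AutoTokenizer

def embed(texts, model_name="bert-base-chinese"):
    """Mean-pooled sentence embeddings for diversity selection (sketch)."""
    tok = AutoTokenizer.from_pretrained(model_name)
    model = AutoModel.from_pretrained(model_name).eval()
    batch = tok(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**batch).last_hidden_state   # (B, T, H)
    mask = batch["attention_mask"].unsqueeze(-1)    # (B, T, 1)
    # Average only over real tokens, ignoring padding positions.
    return (hidden * mask).sum(1) / mask.sum(1)
```

Downstream distance computations are unaffected as long as all embeddings come from the same model.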
-
Is the model in your cloud drive chinese-bert-wwm or chinese-bert-wwm-ext?
-
I tried the DUIE Chinese dataset and the code ran fine with the chinese-bert-wwm-ext pretrained model and the Chinese vocab. But when I switched to the bert-base-cased model and the English vocab to train on an English dataset, I got the error below. I don't know the cause; I already changed the number of relations. Is there something else in the code that needs modifying, or does it only run with the original setup?
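The actual error isn't visible here, but a common failure when swapping a Chinese checkpoint for bert-base-cased is a tokenizer/checkpoint mismatch: token ids produced by one vocab can exceed the other model's embedding table. A hypothetical illustration of that one cause (the ids below are made up for the demo):

```python
import torch

# bert-base-cased has a 28996-entry vocab; an id produced by a different
# (e.g. Chinese) tokenizer can fall outside that range and break the lookup.
emb = torch.nn.Embedding(28996, 768)
ids = torch.tensor([[101, 30000, 102]])   # 30000 is out of range here
try:
    emb(ids)
except IndexError as e:
    print("embedding lookup failed:", e)
```

Checking that the tokenizer and checkpoint come from the same model name usually rules this out before digging into the training code.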
![error screenshot](https://github.com/user-at…
-
Thanks for your work!
I cannot find the file _train-embeddings-base-1gpu.json_ mentioned in _ReadMe.md_, but I did find the _bert-wwm-ext_literature_ file. Does the _bert-wwm-ext_literature_ file replace…
-
【train】Epoch: 10/10 Step: 668/670 loss: 0.00103
【train】Epoch: 10/10 Step: 669/670 loss: 0.00121
【train】Epoch: 10/10 Step: 670/670 loss: 0.00136
[eval] precision=0.0341 recall=0.5020 f1_score=0.0639…
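The logged metrics are internally consistent: the f1 score is the harmonic mean of precision and recall, and the very high recall paired with very low precision suggests the model is predicting positives almost everywhere. A quick check of the arithmetic:

```python
def f1(p, r):
    """Harmonic mean of precision and recall."""
    return 2 * p * r / (p + r)

# Using the logged eval values: precision=0.0341, recall=0.5020.
print(round(f1(0.0341, 0.5020), 4))  # -> 0.0639, matching the logged f1_score
```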