D:\asdasd\AI\GPT-SoVITS-Server-main\GPT-SoVITS-Server-main>python server.py
DirectML is available; DirectML will be used to accelerate inference.
Device name: NVIDIA GeForce GTX 1650
Traceback (most recent call last):
  File "D:\asdasd\AI\GPT-SoVITS-Server-main\GPT-SoVITS-Server-main\server.py", line 71, in
    tokenizer = AutoTokenizer.from_pretrained(bert_path)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\wyc\AppData\Local\Programs\Python\Python312\Lib\site-packages\transformers\models\auto\tokenization_auto.py", line 926, in from_pretrained
    return tokenizer_class_fast.from_pretrained(pretrained_model_name_or_path, *inputs, **kwargs)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\wyc\AppData\Local\Programs\Python\Python312\Lib\site-packages\transformers\tokenization_utils_base.py", line 2200, in from_pretrained
    raise EnvironmentError(
OSError: Can't load tokenizer for './pretrained/chinese-roberta-wwm-ext-large'. If you were trying to load it from 'https://huggingface.co/models', make sure you don't have a local directory with the same name. Otherwise, make sure './pretrained/chinese-roberta-wwm-ext-large' is the correct path to a directory containing all relevant files for a RobertaTokenizerFast tokenizer.
I finally got all the dependencies installed, and it still errors out. This is how my files are laid out at the moment.
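Since the OSError points at a relative path, one quick sanity check is whether `./pretrained/chinese-roberta-wwm-ext-large`, resolved against the directory you run `python server.py` from, actually exists and contains the tokenizer files. A minimal sketch, assuming a typical file set for this model (the exact files required vary by transformers version, so the `required` tuple here is only an illustration):

```python
# Sanity-check sketch: does the tokenizer directory server.py expects exist,
# and does it contain the files the tokenizer loader will look for?
# The file names below are an assumption; vocab.txt and tokenizer_config.json
# are typical for chinese-roberta-wwm-ext-large, but your checkout may differ.
import os

def missing_tokenizer_files(model_dir, required=("vocab.txt", "tokenizer_config.json")):
    """Return the names from `required` that are absent from model_dir."""
    if not os.path.isdir(model_dir):
        return list(required)  # the directory itself is missing
    return [f for f in required if not os.path.isfile(os.path.join(model_dir, f))]

# Run this from the same directory you start server.py in:
print("cwd:", os.getcwd())
print("missing:", missing_tokenizer_files("./pretrained/chinese-roberta-wwm-ext-large"))
```

If `missing` is non-empty, the model files were never downloaded into that folder (or the server is being started from a different working directory, which breaks the relative path).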