huahuahuage / Bert-VITS2-Speech

Bert-VITS2 ONNX inference version
GNU General Public License v3.0
37 stars · 7 forks

Error when loading bert #3

Open galigaligo opened 6 months ago

galigaligo commented 6 months ago

[screenshot attachment: 20240220095146-2]

File "E:\xin\BV2\Bert-VITS2-Speech-main\onnx_infer\text\tokenizer.py", line 36, in BertTokenizerDict DebertaV2TokenizerFast.from_pretrained(ENGLISH_ONNX_LOCAL_DIR) File "D:\soft\miniconda\py310\lib\site-packages\transformers\tokenization_utils_base.py", line 2028, in from_pretrained return cls._from_pretrained( File "D:\soft\miniconda\py310\lib\site-packages\transformers\tokenization_utils_base.py", line 2060, in _from_pretrained slow_tokenizer = (cls.slow_tokenizer_class)._from_pretrained( File "D:\soft\miniconda\py310\lib\site-packages\transformers\tokenization_utils_base.py", line 2260, in _from_pretrained tokenizer = cls(*init_inputs, **init_kwargs) File "D:\soft\miniconda\py310\lib\site-packages\transformers\models\deberta_v2\tokenization_deberta_v2.py", line 138, in init self._tokenizer = SPMTokenizer( File "D:\soft\miniconda\py310\lib\site-packages\transformers\models\deberta_v2\tokenization_deberta_v2.py", line 309, in init spm.load(vocab_file) File "D:\soft\miniconda\py310\lib\site-packages\sentencepiece__init__.py", line 905, in Load return self.LoadFromFile(model_file) File "D:\soft\miniconda\py310\lib\site-packages\sentencepiece__init__.py", line 310, in LoadFromFile return _sentencepiece.SentencePieceProcessor_LoadFromFile(self, arg) RuntimeError: Internal: D:\a\sentencepiece\sentencepiece\src\sentencepiece_processor.cc(1102) [model_proto->ParseFromArray(serialized.data(), serialized.size())]

wildBigPanda commented 3 months ago

I'm running into the same problem. Have you solved it?