Plachtaa / seed-vc

State-of-the-art zero-shot voice conversion & singing voice conversion with in-context learning
GNU General Public License v3.0

It seems that the model download failed? I don't quite understand; how can I solve this? #39

Open qinyu521 opened 3 weeks ago

qinyu521 commented 3 weeks ago

D:\360\seed-vc\seed-vc\app.py:54: FutureWarning: You are using torch.load with weights_only=False (the current default value), which uses the default pickle module implicitly. It is possible to construct malicious pickle data which will execute arbitrary code during unpickling (See https://github.com/pytorch/pytorch/blob/main/SECURITY.md#untrusted-models for more details). In a future release, the default value for weights_only will be flipped to True. This limits the functions that could be executed during unpickling. Arbitrary objects will no longer be allowed to be loaded via this mode unless they are explicitly allowlisted by the user via torch.serialization.add_safe_globals. We recommend you start setting weights_only=True for any use case where you don't have full control of the loaded file. Please open an issue on GitHub for any issues related to this experimental feature.
  ckpt_params = torch.load(ckpt_path, map_location="cpu")
model.safetensors:   0%|          | 0.00/967M [00:18<?, ?B/s]
Traceback (most recent call last):
  File "D:\360\seed-vc\seed-vc\env\lib\site-packages\urllib3\response.py", line 748, in _error_catcher
    yield
  File "D:\360\seed-vc\seed-vc\env\lib\site-packages\urllib3\response.py", line 894, in _raw_read
    raise IncompleteRead(self._fp_bytes_read, self.length_remaining)
urllib3.exceptions.IncompleteRead: IncompleteRead(5316085 bytes read, 961678995 more expected)

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "D:\360\seed-vc\seed-vc\env\lib\site-packages\requests\models.py", line 820, in generate
    yield from self.raw.stream(chunk_size, decode_content=True)
  File "D:\360\seed-vc\seed-vc\env\lib\site-packages\urllib3\response.py", line 1060, in stream
    data = self.read(amt=amt, decode_content=decode_content)
  File "D:\360\seed-vc\seed-vc\env\lib\site-packages\urllib3\response.py", line 977, in read
    data = self._raw_read(amt)
  File "D:\360\seed-vc\seed-vc\env\lib\site-packages\urllib3\response.py", line 872, in _raw_read
    with self._error_catcher():
  File "D:\360\seed-vc\seed-vc\env\lib\contextlib.py", line 153, in __exit__
    self.gen.throw(typ, value, traceback)
  File "D:\360\seed-vc\seed-vc\env\lib\site-packages\urllib3\response.py", line 772, in _error_catcher
    raise ProtocolError(arg, e) from e
urllib3.exceptions.ProtocolError: ('Connection broken: IncompleteRead(5316085 bytes read, 961678995 more expected)', IncompleteRead(5316085 bytes read, 961678995 more expected))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "D:\360\seed-vc\seed-vc\app.py", line 66, in <module>
    whisper_model = WhisperModel.from_pretrained(whisper_name, torch_dtype=torch.float16).to(device)
  File "D:\360\seed-vc\seed-vc\env\lib\site-packages\transformers\modeling_utils.py", line 3809, in from_pretrained
    resolved_archive_file = cached_file(pretrained_model_name_or_path, filename, **cached_file_kwargs)
  File "D:\360\seed-vc\seed-vc\env\lib\site-packages\transformers\utils\hub.py", line 403, in cached_file
    resolved_file = hf_hub_download(
  File "D:\360\seed-vc\seed-vc\env\lib\site-packages\huggingface_hub\utils\_deprecation.py", line 101, in inner_f
    return f(*args, **kwargs)
  File "D:\360\seed-vc\seed-vc\env\lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
  File "D:\360\seed-vc\seed-vc\env\lib\site-packages\huggingface_hub\file_download.py", line 1232, in hf_hub_download
    return _hf_hub_download_to_cache_dir(
  File "D:\360\seed-vc\seed-vc\env\lib\site-packages\huggingface_hub\file_download.py", line 1381, in _hf_hub_download_to_cache_dir
    _download_to_tmp_and_move(
  File "D:\360\seed-vc\seed-vc\env\lib\site-packages\huggingface_hub\file_download.py", line 1915, in _download_to_tmp_and_move
    http_get(
  File "D:\360\seed-vc\seed-vc\env\lib\site-packages\huggingface_hub\file_download.py", line 541, in http_get
    for chunk in r.iter_content(chunk_size=constants.DOWNLOAD_CHUNK_SIZE):
  File "D:\360\seed-vc\seed-vc\env\lib\site-packages\requests\models.py", line 822, in generate
    raise ChunkedEncodingError(e)
requests.exceptions.ChunkedEncodingError: ('Connection broken: IncompleteRead(5316085 bytes read, 961678995 more expected)', IncompleteRead(5316085 bytes read, 961678995 more expected))
Press any key to continue . . .
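For context, the traceback shows hf_hub_download being cut off roughly 5 MB into a ~967 MB model.safetensors download. A minimal sketch of one workaround is to pre-fetch the checkpoint into the local Hugging Face cache with retries, so that app.py finds it there on the next run. The repo id "openai/whisper-small" below is only an assumption for whisper_name (the log does not show its value), so adjust it to whatever app.py actually loads:

import time
from huggingface_hub import snapshot_download

repo_id = "openai/whisper-small"  # hypothetical; match whisper_name in app.py

for attempt in range(5):
    try:
        # snapshot_download stores files in the shared HF cache that from_pretrained also reads
        path = snapshot_download(repo_id)
        print("cached at", path)
        break
    except Exception as err:  # e.g. requests.exceptions.ChunkedEncodingError
        print(f"attempt {attempt + 1} failed: {err}; retrying in 5 s")
        time.sleep(5)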

Plachtaa commented 3 weeks ago

If you are in mainland China, try passing the environment variable HF_ENDPOINT=https://hf-mirror.com to use the mirror site. Please let me know if this is not your case.
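For example, in a Windows cmd prompt (the paths in the log suggest Windows), set the variable in the same session before launching the app; this is just a sketch of the suggestion above, not project-specific tooling:

set HF_ENDPOINT=https://hf-mirror.com
python app.py

In PowerShell the equivalent is $env:HF_ENDPOINT = "https://hf-mirror.com". The variable must be set before app.py starts, because huggingface_hub reads HF_ENDPOINT when it is imported.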