pinokiofactory / cogstudio


Unable to download models #28

Closed: mflux closed this issue 2 weeks ago

mflux commented 2 weeks ago
diffusion_pytorch_model.safetensors:  90%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████▎           | 3.06G/3.39G [1:02:27<14:55, 364kB/s]
model-00001-of-00002.safetensors:  46%|███████████████████████████████████████████████████████▊                                                                  | 2.29G/4.99G [1:02:27<2:04:07, 364kB/s]
model-00002-of-00002.safetensors:  52%|███████████████████████████████████████████████████████████████▎                                                          | 2.35G/4.53G [1:02:27<1:30:51, 400kB/s]
Traceback (most recent call last):
  File "C:\pinokio\api\cogstudio.git\app\env\lib\site-packages\urllib3\response.py", line 748, in _error_catcher
    yield
  File "C:\pinokio\api\cogstudio.git\app\env\lib\site-packages\urllib3\response.py", line 894, in _raw_read
    raise IncompleteRead(self._fp_bytes_read, self.length_remaining)
urllib3.exceptions.IncompleteRead: IncompleteRead(1365272116 bytes read, 2706563108 more expected)

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "C:\pinokio\api\cogstudio.git\app\env\lib\site-packages\requests\models.py", line 820, in generate
    yield from self.raw.stream(chunk_size, decode_content=True)
  File "C:\pinokio\api\cogstudio.git\app\env\lib\site-packages\urllib3\response.py", line 1060, in stream
    data = self.read(amt=amt, decode_content=decode_content)
  File "C:\pinokio\api\cogstudio.git\app\env\lib\site-packages\urllib3\response.py", line 977, in read
    data = self._raw_read(amt)
  File "C:\pinokio\api\cogstudio.git\app\env\lib\site-packages\urllib3\response.py", line 872, in _raw_read
    with self._error_catcher():
  File "C:\pinokio\bin\miniconda\lib\contextlib.py", line 153, in __exit__
    self.gen.throw(typ, value, traceback)
  File "C:\pinokio\api\cogstudio.git\app\env\lib\site-packages\urllib3\response.py", line 772, in _error_catcher
    raise ProtocolError(arg, e) from e
urllib3.exceptions.ProtocolError: ('Connection broken: IncompleteRead(1365272116 bytes read, 2706563108 more expected)', IncompleteRead(1365272116 bytes read, 2706563108 more expected))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\pinokio\api\cogstudio.git\app\env\lib\site-packages\gradio\queueing.py", line 624, in process_events
    response = await route_utils.call_process_api(
  File "C:\pinokio\api\cogstudio.git\app\env\lib\site-packages\gradio\route_utils.py", line 323, in call_process_api
    output = await app.get_blocks().process_api(
  File "C:\pinokio\api\cogstudio.git\app\env\lib\site-packages\gradio\blocks.py", line 2018, in process_api
    result = await self.call_function(
  File "C:\pinokio\api\cogstudio.git\app\env\lib\site-packages\gradio\blocks.py", line 1567, in call_function
    prediction = await anyio.to_thread.run_sync(  # type: ignore
  File "C:\pinokio\api\cogstudio.git\app\env\lib\site-packages\anyio\to_thread.py", line 56, in run_sync
    return await get_async_backend().run_sync_in_worker_thread(
  File "C:\pinokio\api\cogstudio.git\app\env\lib\site-packages\anyio\_backends\_asyncio.py", line 2441, in run_sync_in_worker_thread
    return await future
  File "C:\pinokio\api\cogstudio.git\app\env\lib\site-packages\anyio\_backends\_asyncio.py", line 943, in run
    result = context.run(func, *args)
  File "C:\pinokio\api\cogstudio.git\app\env\lib\site-packages\gradio\utils.py", line 846, in wrapper
    response = f(*args, **kwargs)
  File "C:\pinokio\api\cogstudio.git\app\env\lib\site-packages\gradio\utils.py", line 846, in wrapper
    response = f(*args, **kwargs)
  File "C:\pinokio\api\cogstudio.git\app\inference\gradio_composite_demo\cogstudio.py", line 727, in generate
    latents, seed = infer(
  File "C:\pinokio\api\cogstudio.git\app\inference\gradio_composite_demo\cogstudio.py", line 217, in infer
    init(name, image_input, video_input, dtype, full_gpu)
  File "C:\pinokio\api\cogstudio.git\app\inference\gradio_composite_demo\cogstudio.py", line 57, in init
    init_txt2vid(name, dtype_str, full_gpu)
  File "C:\pinokio\api\cogstudio.git\app\inference\gradio_composite_demo\cogstudio.py", line 84, in init_txt2vid
    dtype = init_core(name, dtype_str)
  File "C:\pinokio\api\cogstudio.git\app\inference\gradio_composite_demo\cogstudio.py", line 67, in init_core
    pipe = CogVideoXPipeline.from_pretrained(name, torch_dtype=dtype).to(device)
  File "C:\pinokio\api\cogstudio.git\app\env\lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
  File "C:\pinokio\api\cogstudio.git\app\env\lib\site-packages\diffusers\pipelines\pipeline_utils.py", line 725, in from_pretrained
    cached_folder = cls.download(
  File "C:\pinokio\api\cogstudio.git\app\env\lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
  File "C:\pinokio\api\cogstudio.git\app\env\lib\site-packages\diffusers\pipelines\pipeline_utils.py", line 1433, in download
    cached_folder = snapshot_download(
  File "C:\pinokio\api\cogstudio.git\app\env\lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
  File "C:\pinokio\api\cogstudio.git\app\env\lib\site-packages\huggingface_hub\_snapshot_download.py", line 293, in snapshot_download
    thread_map(
  File "C:\pinokio\api\cogstudio.git\app\env\lib\site-packages\tqdm\contrib\concurrent.py", line 69, in thread_map
    return _executor_map(ThreadPoolExecutor, fn, *iterables, **tqdm_kwargs)
  File "C:\pinokio\api\cogstudio.git\app\env\lib\site-packages\tqdm\contrib\concurrent.py", line 51, in _executor_map
    return list(tqdm_class(ex.map(fn, *iterables, chunksize=chunksize), **kwargs))
  File "C:\pinokio\api\cogstudio.git\app\env\lib\site-packages\gradio\helpers.py", line 707, in __next__
    return next(current_iterable.iterable)  # type: ignore
  File "C:\pinokio\bin\miniconda\lib\concurrent\futures\_base.py", line 621, in result_iterator
    yield _result_or_cancel(fs.pop())
  File "C:\pinokio\bin\miniconda\lib\concurrent\futures\_base.py", line 319, in _result_or_cancel
    return fut.result(timeout)
  File "C:\pinokio\bin\miniconda\lib\concurrent\futures\_base.py", line 458, in result
    return self.__get_result()
  File "C:\pinokio\bin\miniconda\lib\concurrent\futures\_base.py", line 403, in __get_result
    raise self._exception
  File "C:\pinokio\bin\miniconda\lib\concurrent\futures\thread.py", line 58, in run
    result = self.fn(*self.args, **self.kwargs)
  File "C:\pinokio\api\cogstudio.git\app\env\lib\site-packages\huggingface_hub\_snapshot_download.py", line 267, in _inner_hf_hub_download
    return hf_hub_download(
  File "C:\pinokio\api\cogstudio.git\app\env\lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
  File "C:\pinokio\api\cogstudio.git\app\env\lib\site-packages\huggingface_hub\file_download.py", line 862, in hf_hub_download
    return _hf_hub_download_to_cache_dir(
  File "C:\pinokio\api\cogstudio.git\app\env\lib\site-packages\huggingface_hub\file_download.py", line 1011, in _hf_hub_download_to_cache_dir
    _download_to_tmp_and_move(
  File "C:\pinokio\api\cogstudio.git\app\env\lib\site-packages\huggingface_hub\file_download.py", line 1545, in _download_to_tmp_and_move
    http_get(
  File "C:\pinokio\api\cogstudio.git\app\env\lib\site-packages\huggingface_hub\file_download.py", line 454, in http_get
    for chunk in r.iter_content(chunk_size=constants.DOWNLOAD_CHUNK_SIZE):
  File "C:\pinokio\api\cogstudio.git\app\env\lib\site-packages\requests\models.py", line 822, in generate
    raise ChunkedEncodingError(e)
requests.exceptions.ChunkedEncodingError: ('Connection broken: IncompleteRead(1365272116 bytes read, 2706563108 more expected)', IncompleteRead(1365272116 bytes read, 2706563108 more expected))

The download has been stuck like this for days and eventually times out with the error above. I've restarted multiple times and checked both my firewall and VPN.
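One workaround worth trying: pre-download the model into the Hugging Face cache outside the Gradio app, with retries. In recent versions of `huggingface_hub`, partially downloaded files are kept as `*.incomplete` in the cache and resumed on the next call, so a retry loop makes forward progress instead of restarting a 13 GB transfer from zero when the connection drops. Below is a minimal sketch of such a retry helper; the `retry` function and the repo id `THUDM/CogVideoX-5b` are assumptions for illustration (the traceback does not show which checkpoint `name` refers to).

```python
import time


def retry(fn, attempts=5, delay=1.0):
    """Call fn() until it succeeds, sleeping between failed attempts.

    Re-raises the last exception if all attempts fail.
    """
    for i in range(attempts):
        try:
            return fn()
        except Exception:
            if i == attempts - 1:
                raise
            time.sleep(delay)


# Real usage (needs network access and huggingface_hub installed).
# The repo id is a guess -- substitute the checkpoint cogstudio loads:
#
#   from huggingface_hub import snapshot_download
#   path = retry(lambda: snapshot_download("THUDM/CogVideoX-5b"), delay=10)
#   print("cached at", path)
```

Once the snapshot is fully cached, `CogVideoXPipeline.from_pretrained(...)` should find the files locally and skip the flaky in-app download entirely. At ~360 kB/s, though, the transfer speed itself looks throttled, so it may also be worth testing the connection to `huggingface.co` from another network.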