zixaphir / Stable-Diffusion-Webui-Civitai-Helper

Stable Diffusion Webui Extension for Civitai, to manage your models much more easily.

[BUG] Model scan fails: asyncio/h11/uvicorn #39

Closed Policturn closed 7 months ago

Policturn commented 8 months ago
RuntimeError: dictionary changed size during iteration
Hint: The Python runtime threw an exception. Please check the troubleshooting page.
Civitai Helper: Start scan_model
Civitai Helper: Scanning path: H:\sd-webui-aki-v4.4\models\Lora
Civitai Helper: Model metadata not needed for 64x64-10.safetensors
Civitai Helper: Downloading model image.
Civitai Helper: Checking preview image for model: H:\sd-webui-aki-v4.4\models\Lora\64x64-10.safetensors
Civitai Helper: Existing model image found. Skipping.
Civitai Helper: Model metadata not needed for abtmv0.22.safetensors
Civitai Helper: Downloading model image.
Civitai Helper: Checking preview image for model: H:\sd-webui-aki-v4.4\models\Lora\abtmv0.22.safetensors
Civitai Helper: Existing model image found. Skipping.
Civitai Helper: Model metadata not needed for anmi-ver3.safetensors
Civitai Helper: Downloading model image.
Civitai Helper: Checking preview image for model: H:\sd-webui-aki-v4.4\models\Lora\anmi-ver3.safetensors
Civitai Helper: Existing model image found. Skipping.
Civitai Helper: Model metadata not needed for anmi.safetensors
Civitai Helper: Downloading model image.
Civitai Helper: Checking preview image for model: H:\sd-webui-aki-v4.4\models\Lora\anmi.safetensors
Civitai Helper: This image is NSFW: Mature
Civitai Helper: Start downloading from: https://image.civitai.com/xG1nkqKTMzGDvpLrqFT7WA/8227cbf6-2da2-4dbe-bc18-bbc43e0d3b46/width=2048/978232.jpeg
Civitai Helper: Target file path: H:\sd-webui-aki-v4.4\models\Lora\anmi.preview.png
Civitai Helper: File size: 759081
Civitai Helper: Downloading to temp file: H:\sd-webui-aki-v4.4\models\Lora\anmi.preview.png.downloading
Civitai Helper: File Downloaded to: H:\sd-webui-aki-v4.4\models\Lora\anmi.preview.png
Civitai Helper: Creating model info for: gbaportrait.safetensors
Civitai Helper: Using SD Webui SHA256
Calculating sha256 for H:\sd-webui-aki-v4.4\models\Lora\gbaportrait.safetensors: ddf667109c5c0dd91f490207bd84b48594bfaebeab84cf2d7b6e19447f285f76
Civitai Helper: Request model info from civitai
ERROR:asyncio:Exception in callback H11Protocol.timeout_keep_alive_handler()
handle: <TimerHandle when=22788.0 H11Protocol.timeout_keep_alive_handler()>
Traceback (most recent call last):
  File "H:\sd-webui-aki-v4.4\venv\lib\site-packages\h11\_state.py", line 249, in _fire_event_triggered_transitions
    new_state = EVENT_TRIGGERED_TRANSITIONS[role][state][event_type]
KeyError: <class 'h11._events.ConnectionClosed'>
Hint: The Python runtime threw an exception. Please check the troubleshooting page.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Users\PC\AppData\Local\Programs\Python\Python310\lib\asyncio\events.py", line 80, in _run
    self._context.run(self._callback, *self._args)
  File "H:\sd-webui-aki-v4.4\venv\lib\site-packages\uvicorn\protocols\http\h11_impl.py", line 363, in timeout_keep_alive_handler
    self.conn.send(event)
  File "H:\sd-webui-aki-v4.4\venv\lib\site-packages\h11\_connection.py", line 468, in send
    data_list = self.send_with_data_passthrough(event)
  File "H:\sd-webui-aki-v4.4\venv\lib\site-packages\h11\_connection.py", line 493, in send_with_data_passthrough
    self._process_event(self.our_role, event)
  File "H:\sd-webui-aki-v4.4\venv\lib\site-packages\h11\_connection.py", line 242, in _process_event
    self._cstate.process_event(role, type(event), server_switch_event)
  File "H:\sd-webui-aki-v4.4\venv\lib\site-packages\h11\_state.py", line 238, in process_event
    self._fire_event_triggered_transitions(role, event_type)
  File "H:\sd-webui-aki-v4.4\venv\lib\site-packages\h11\_state.py", line 251, in _fire_event_triggered_transitions
    raise LocalProtocolError(
h11._util.LocalProtocolError: can't handle event type ConnectionClosed when role=SERVER and state=SEND_RESPONSE
Civitai Helper: Fetching Parent Model Information
Civitai Helper: Request model info from civitai: 184199
Civitai Helper: Write model civitai info to file: H:\sd-webui-aki-v4.4\models\Lora\gbaportrait.civitai.info
Civitai Helper: Write model SD webui info to file: H:\sd-webui-aki-v4.4\models\Lora\gbaportrait.json
Civitai Helper: Write model webui info to file: H:\sd-webui-aki-v4.4\models\Lora\gbaportrait.json
Civitai Helper: delay: 0.2 second
Civitai Helper: Downloading model image.
Civitai Helper: Checking preview image for model: H:\sd-webui-aki-v4.4\models\Lora\gbaportrait.safetensors
Civitai Helper: This image is NSFW: Soft
Civitai Helper: Start downloading from: https://image.civitai.com/xG1nkqKTMzGDvpLrqFT7WA/cc6ccdad-6869-4e52-ba22-c3648f385ea0/width=1024/3297499.jpeg
Civitai Helper: Target file path: H:\sd-webui-aki-v4.4\models\Lora\gbaportrait.preview.png
Civitai Helper: File size: 175760
Civitai Helper: Downloading to temp file: H:\sd-webui-aki-v4.4\models\Lora\gbaportrait.preview.png.downloading
*** API error: POST: http://127.0.0.1:7860/api/predict {'error': 'LocalProtocolError', 'detail': '', 'body': '', 'errors': "Can't send data when our state is ERROR"}
    Traceback (most recent call last):
      File "H:\sd-webui-aki-v4.4\venv\lib\site-packages\starlette\middleware\errors.py", line 162, in __call__
        await self.app(scope, receive, _send)
      File "H:\sd-webui-aki-v4.4\venv\lib\site-packages\starlette\middleware\base.py", line 109, in __call__
        await response(scope, receive, send)
      File "H:\sd-webui-aki-v4.4\venv\lib\site-packages\starlette\responses.py", line 270, in __call__
        async with anyio.create_task_group() as task_group:
      File "H:\sd-webui-aki-v4.4\venv\lib\site-packages\anyio\_backends\_asyncio.py", line 597, in __aexit__
        raise exceptions[0]
      File "H:\sd-webui-aki-v4.4\venv\lib\site-packages\starlette\responses.py", line 273, in wrap
        await func()
      File "H:\sd-webui-aki-v4.4\venv\lib\site-packages\starlette\middleware\base.py", line 134, in stream_response
        return await super().stream_response(send)
      File "H:\sd-webui-aki-v4.4\venv\lib\site-packages\starlette\responses.py", line 255, in stream_response
        await send(
      File "H:\sd-webui-aki-v4.4\venv\lib\site-packages\starlette\middleware\errors.py", line 159, in _send
        await send(message)
      File "H:\sd-webui-aki-v4.4\venv\lib\site-packages\uvicorn\protocols\http\h11_impl.py", line 490, in send
        output = self.conn.send(event=response)
      File "H:\sd-webui-aki-v4.4\venv\lib\site-packages\h11\_connection.py", line 468, in send
        data_list = self.send_with_data_passthrough(event)
      File "H:\sd-webui-aki-v4.4\venv\lib\site-packages\h11\_connection.py", line 483, in send_with_data_passthrough
        raise LocalProtocolError("Can't send data when our state is ERROR")
    h11._util.LocalProtocolError: Can't send data when our state is ERROR

---
ERROR:    Exception in ASGI application
Traceback (most recent call last):
  File "H:\sd-webui-aki-v4.4\venv\lib\site-packages\uvicorn\protocols\http\h11_impl.py", line 408, in run_asgi
    result = await app(  # type: ignore[func-returns-value]
  File "H:\sd-webui-aki-v4.4\venv\lib\site-packages\uvicorn\middleware\proxy_headers.py", line 84, in __call__
    return await self.app(scope, receive, send)
  File "H:\sd-webui-aki-v4.4\venv\lib\site-packages\fastapi\applications.py", line 273, in __call__
    await super().__call__(scope, receive, send)
  File "H:\sd-webui-aki-v4.4\venv\lib\site-packages\starlette\applications.py", line 122, in __call__
    await self.middleware_stack(scope, receive, send)
  File "H:\sd-webui-aki-v4.4\venv\lib\site-packages\starlette\middleware\errors.py", line 184, in __call__
    raise exc
  File "H:\sd-webui-aki-v4.4\venv\lib\site-packages\starlette\middleware\errors.py", line 162, in __call__
    await self.app(scope, receive, _send)
  File "H:\sd-webui-aki-v4.4\venv\lib\site-packages\starlette\middleware\base.py", line 109, in __call__
    await response(scope, receive, send)
  File "H:\sd-webui-aki-v4.4\venv\lib\site-packages\starlette\responses.py", line 270, in __call__
    async with anyio.create_task_group() as task_group:
  File "H:\sd-webui-aki-v4.4\venv\lib\site-packages\anyio\_backends\_asyncio.py", line 597, in __aexit__
    raise exceptions[0]
  File "H:\sd-webui-aki-v4.4\venv\lib\site-packages\starlette\responses.py", line 273, in wrap
    await func()
  File "H:\sd-webui-aki-v4.4\venv\lib\site-packages\starlette\middleware\base.py", line 134, in stream_response
    return await super().stream_response(send)
  File "H:\sd-webui-aki-v4.4\venv\lib\site-packages\starlette\responses.py", line 255, in stream_response
    await send(
  File "H:\sd-webui-aki-v4.4\venv\lib\site-packages\starlette\middleware\errors.py", line 159, in _send
    await send(message)
  File "H:\sd-webui-aki-v4.4\venv\lib\site-packages\uvicorn\protocols\http\h11_impl.py", line 490, in send
    output = self.conn.send(event=response)
  File "H:\sd-webui-aki-v4.4\venv\lib\site-packages\h11\_connection.py", line 468, in send
    data_list = self.send_with_data_passthrough(event)
  File "H:\sd-webui-aki-v4.4\venv\lib\site-packages\h11\_connection.py", line 483, in send_with_data_passthrough
    raise LocalProtocolError("Can't send data when our state is ERROR")
h11._util.LocalProtocolError: Can't send data when our state is ERROR
'AsyncRequest' object has no attribute '_json_response_data'

This error is accompanied by an incomplete xxx.preview.png.downloading file. Switching VPN nodes changes how many files download before the error triggers.

In the console you can see a download progress bar stuck midway or at 100%.

It would help to add a button that force-terminates the current process, so a hang wouldn't require a full restart.

zixaphir commented 8 months ago

I'm trying to solve this but it is a very low-level error that may be above my current ability to fix.

QuietNoise commented 8 months ago

From what I've seen so far, the two errors almost always accompany each other.

First you get the one starting with ERROR:asyncio:Exception in callback H11Protocol.timeout_keep_alive_handler(). It might be a coincidence, but this usually happens either during/after hashing big files or after requesting model info (Civitai Helper: Request model info from civitai).

From my understanding, the first error puts some component into an error state, and when Civitai Helper then tries to download a preview image it crashes completely with the *** API error: POST: http://127.0.0.1:7860/api/predict ... Can't send data when our state is ERROR error.

The good thing is that if I reload the webui in the browser, I can continue scanning where it left off, so ultimately I will end up with everything scanned.

Policturn commented 8 months ago

I'm trying to solve this but it is a very low-level error that may be above my current ability to fix.

I think a reset button could force things back to the initial state when the process gets stuck, so we could avoid repeated restarts and mitigate the bug to some extent. It would also be good to automatically clean up the leftover "downloading" files remaining in the model folder.

zixaphir commented 8 months ago

I'm trying to solve this but it is a very low-level error that may be above my current ability to fix.

I think a reset button could force things back to the initial state when the process gets stuck, so we could avoid repeated restarts and mitigate the bug to some extent. It would also be good to automatically clean up the leftover "downloading" files remaining in the model folder.

This is what I mean, though. I'm not entirely sure what component needs to be restarted. It is not failing on the download end as far as I can tell: the download is handled entirely in Python's requests module, and h11 isn't used by requests as far as I can tell. It seems to be an error in the connection between the webui server and the Gradio client, which I'm not entirely sure how to directly manipulate. But I'm showing my inexperience here, because based on the error message, this is just my best guess.

zixaphir commented 7 months ago

I currently believe that the issue is happening somewhere around https://github.com/gradio-app/gradio/blob/1dc797adcba8424bac87219713638442ebe2d841/gradio/queueing.py#L356-L382, but I am not sure how to go about fixing it/preventing it/resetting it on my end. I'm adding some functionality to try to keep the connection from timing out, which seems to be the root of the issue, but I have not been able to replicate the issue on my end, which prevents me from trying to trace it any further.

zixaphir commented 7 months ago

The queuing code in Gradio where this issue seems to be occurring has changed significantly in newer versions of Gradio, so it may resolve itself if webui updates to require a newer version of Gradio.

Policturn commented 7 months ago

The queuing code in Gradio where this issue seems to be occurring has changed significantly in newer versions of Gradio, so it may resolve itself if webui updates to require a newer version of Gradio.

I tried rolling the version back to 1.7.0.

The issue did not occur again there, though it was still present in the 1.7.3 release.

zixaphir commented 7 months ago

Is this still happening in v1.8.0?

noonz66 commented 7 months ago

updated, and I still get the error:

Civitai Helper: Downloading model image.
Civitai Helper: Checking preview image for model: E:\stable-diffusion-webui\models\Stable-diffusion\aegithalidaeV1_psaltriparusMinimus.safetensors
Civitai Helper: Existing model image found. Skipping.
Civitai Helper: Creating model info for: kohakuXLBeta_beta7Pro.safetensors
Civitai Helper: Using SD Webui SHA256
ERROR:asyncio:Exception in callback H11Protocol.timeout_keep_alive_handler()
handle: <TimerHandle when=5284.781 H11Protocol.timeout_keep_alive_handler()>
Traceback (most recent call last):
  File "E:\stable-diffusion-webui\venv\lib\site-packages\h11\_state.py", line 249, in _fire_event_triggered_transitions
    new_state = EVENT_TRIGGERED_TRANSITIONS[role][state][event_type]
KeyError: <class 'h11._events.ConnectionClosed'>

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Program Files\Python310\lib\asyncio\events.py", line 80, in _run
    self._context.run(self._callback, *self._args)
  File "E:\stable-diffusion-webui\venv\lib\site-packages\uvicorn\protocols\http\h11_impl.py", line 363, in timeout_keep_alive_handler
    self.conn.send(event)
  File "E:\stable-diffusion-webui\venv\lib\site-packages\h11\_connection.py", line 468, in send
    data_list = self.send_with_data_passthrough(event)
  File "E:\stable-diffusion-webui\venv\lib\site-packages\h11\_connection.py", line 493, in send_with_data_passthrough
    self._process_event(self.our_role, event)
  File "E:\stable-diffusion-webui\venv\lib\site-packages\h11\_connection.py", line 242, in _process_event
    self._cstate.process_event(role, type(event), server_switch_event)
  File "E:\stable-diffusion-webui\venv\lib\site-packages\h11\_state.py", line 238, in process_event
    self._fire_event_triggered_transitions(role, event_type)
  File "E:\stable-diffusion-webui\venv\lib\site-packages\h11\_state.py", line 251, in _fire_event_triggered_transitions
    raise LocalProtocolError(
h11._util.LocalProtocolError: can't handle event type ConnectionClosed when role=SERVER and state=SEND_RESPONSE
Calculating sha256 for E:\stable-diffusion-webui\models\Stable-diffusion\kohakuXLBeta_beta7Pro.safetensors: 51a0c178b7ad47966bdf3397e51a385edf151ed387ba31bab447a12071507933
Civitai Helper: Request model info from civitai
Civitai Helper: Fetching Parent Model Information
Civitai Helper: Request model info from civitai: 162577
Civitai Helper: Write model civitai info to file: E:\stable-diffusion-webui\models\Stable-diffusion\kohakuXLBeta_beta7Pro.civitai.info
Civitai Helper: Write model webui info to file: E:\stable-diffusion-webui\models\Stable-diffusion\kohakuXLBeta_beta7Pro.json
Civitai Helper: Downloading model image.
Civitai Helper: Checking preview image for model: E:\stable-diffusion-webui\models\Stable-diffusion\kohakuXLBeta_beta7Pro.safetensors
Civitai Helper: Start downloading from: https://image.civitai.com/xG1nkqKTMzGDvpLrqFT7WA/3664a94e-efca-4163-b4ed-82da95528707/width=1728/3237064.jpeg
Civitai Helper: Target file path: E:\stable-diffusion-webui\models\Stable-diffusion\kohakuXLBeta_beta7Pro.preview.png
Civitai Helper: File size: 465549 (454.64K)
Civitai Helper: Downloading to temp file: E:\stable-diffusion-webui\models\Stable-diffusion\kohakuXLBeta_beta7Pro.preview.png.downloading
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 455k/455k [00:00<00:00, 991kiB/s] 
Civitai Helper: File Downloaded to: E:\stable-diffusion-webui\models\Stable-diffusion\kohakuXLBeta_beta7Pro.preview.png
Civitai Helper: Creating model info for: animeArtDiffusionXL_alpha3.safetensors
*** API error: POST: http://127.0.0.1:7861/api/predict {'error': 'LocalProtocolError', 'detail': '', 'body': '', 'errors': "Can't send data when our state is ERROR"}
    Traceback (most recent call last):
      File "E:\stable-diffusion-webui\venv\lib\site-packages\starlette\middleware\errors.py", line 162, in __call__
        await self.app(scope, receive, _send)
      File "E:\stable-diffusion-webui\venv\lib\site-packages\starlette\middleware\base.py", line 109, in __call__
        await response(scope, receive, send)
      File "E:\stable-diffusion-webui\venv\lib\site-packages\starlette\responses.py", line 270, in __call__
        async with anyio.create_task_group() as task_group:
      File "E:\stable-diffusion-webui\venv\lib\site-packages\anyio\_backends\_asyncio.py", line 597, in __aexit__
        raise exceptions[0]
      File "E:\stable-diffusion-webui\venv\lib\site-packages\starlette\responses.py", line 273, in wrap
        await func()
      File "E:\stable-diffusion-webui\venv\lib\site-packages\starlette\middleware\base.py", line 134, in stream_response
        return await super().stream_response(send)
      File "E:\stable-diffusion-webui\venv\lib\site-packages\starlette\responses.py", line 255, in stream_response
        await send(
      File "E:\stable-diffusion-webui\venv\lib\site-packages\starlette\middleware\errors.py", line 159, in _send
        await send(message)
      File "E:\stable-diffusion-webui\venv\lib\site-packages\uvicorn\protocols\http\h11_impl.py", line 490, in send
        output = self.conn.send(event=response)
      File "E:\stable-diffusion-webui\venv\lib\site-packages\h11\_connection.py", line 468, in send
        data_list = self.send_with_data_passthrough(event)
      File "E:\stable-diffusion-webui\venv\lib\site-packages\h11\_connection.py", line 483, in send_with_data_passthrough
        raise LocalProtocolError("Can't send data when our state is ERROR")
    h11._util.LocalProtocolError: Can't send data when our state is ERROR

---
ERROR:    Exception in ASGI application
Traceback (most recent call last):
  File "E:\stable-diffusion-webui\venv\lib\site-packages\uvicorn\protocols\http\h11_impl.py", line 408, in run_asgi
    result = await app(  # type: ignore[func-returns-value]
  File "E:\stable-diffusion-webui\venv\lib\site-packages\uvicorn\middleware\proxy_headers.py", line 84, in __call__
    return await self.app(scope, receive, send)
  File "E:\stable-diffusion-webui\venv\lib\site-packages\fastapi\applications.py", line 273, in __call__
    await super().__call__(scope, receive, send)
  File "E:\stable-diffusion-webui\venv\lib\site-packages\starlette\applications.py", line 122, in __call__
    await self.middleware_stack(scope, receive, send)
  File "E:\stable-diffusion-webui\venv\lib\site-packages\starlette\middleware\errors.py", line 184, in __call__
    raise exc
  File "E:\stable-diffusion-webui\venv\lib\site-packages\starlette\middleware\errors.py", line 162, in __call__
    await self.app(scope, receive, _send)
  File "E:\stable-diffusion-webui\venv\lib\site-packages\starlette\middleware\base.py", line 109, in __call__
    await response(scope, receive, send)
  File "E:\stable-diffusion-webui\venv\lib\site-packages\starlette\responses.py", line 270, in __call__
    async with anyio.create_task_group() as task_group:
  File "E:\stable-diffusion-webui\venv\lib\site-packages\anyio\_backends\_asyncio.py", line 597, in __aexit__
    raise exceptions[0]
  File "E:\stable-diffusion-webui\venv\lib\site-packages\starlette\responses.py", line 273, in wrap
    await func()
  File "E:\stable-diffusion-webui\venv\lib\site-packages\starlette\middleware\base.py", line 134, in stream_response
    return await super().stream_response(send)
  File "E:\stable-diffusion-webui\venv\lib\site-packages\starlette\responses.py", line 255, in stream_response
    await send(
  File "E:\stable-diffusion-webui\venv\lib\site-packages\starlette\middleware\errors.py", line 159, in _send
    await send(message)
  File "E:\stable-diffusion-webui\venv\lib\site-packages\uvicorn\protocols\http\h11_impl.py", line 490, in send
    output = self.conn.send(event=response)
  File "E:\stable-diffusion-webui\venv\lib\site-packages\h11\_connection.py", line 468, in send
    data_list = self.send_with_data_passthrough(event)
  File "E:\stable-diffusion-webui\venv\lib\site-packages\h11\_connection.py", line 483, in send_with_data_passthrough
    raise LocalProtocolError("Can't send data when our state is ERROR")
h11._util.LocalProtocolError: Can't send data when our state is ERROR
'AsyncRequest' object has no attribute '_json_response_data'
QuietNoise commented 7 months ago

Still happening in 1.8 with the same frequency, I reckon. Wish I was more into Python 😄 Is h11 used for the connection between the browser client and the webui backend?

Wild guessing here again, but it feels like whenever something blocking happens (i.e. calculating a big file hash on a slower drive, or a connection to civitai taking longer than it should), that h11 thing is unable to keep the connection alive and fails/closes it? Since I moved all my models to a faster drive, this now happens exclusively when the extension is fetching something from civitai.

Maybe there is a setting somewhere to increase h11's timeout ticker, or the blocking tasks could be moved to a separate non-blocking thread... I don't know, I'm waffling here 😛

I wish I could give it more time, but refreshing the webui to keep going is good enough for me for now.

zixaphir commented 7 months ago

Is H11 used for connection between browser client and webgui on the back?

This seems to be the case, yes. Webui uses Gradio, which manages the core backend everything else is connected to. Gradio is the server, the client interface, and handles interaction between the user and the AI models. Extensions are also built in Gradio (which has been quite a headache for me because its interface blocks are very static once created). Gradio uses h11, which is a minimal HTTP protocol implementation that's supposed to only implement the protocol and leaves actually building the interactions with it to you, which is what Gradio has done. (You can read h11's self description here: https://h11.readthedocs.io/en/latest/).

The problem is that I feed Gradio a function that tells it what information I want to send and receive, and then it handles actually sending and receiving it. I can't touch the connection itself, so when it fails, this extension doesn't actually know it failed, and Gradio just continues to wait for its next response. I have no idea how to ask Gradio to try to recover, because all I can give Gradio are sender and receiver functions. I don't know how to give it an error handler.
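To illustrate that arrangement: at the ASGI layer underneath Gradio, the application only ever awaits `send()`/`receive()` callables that the server hands it, so a transport failure inside uvicorn/h11 is invisible to the application until a send blows up. A minimal sketch with stubbed transport callables (all names here are illustrative, not webui's actual code):

```python
import asyncio

async def app(scope, receive, send):
    # The framework (Gradio/FastAPI) lives at this level. It never
    # touches the socket; it only awaits the callables it was given.
    await receive()  # consume the request body event
    await send({"type": "http.response.start", "status": 200,
                "headers": [(b"content-type", b"text/plain")]})
    await send({"type": "http.response.body", "body": b"ok"})

async def run_once():
    sent = []  # stand-in for uvicorn's h11 connection

    async def receive():
        return {"type": "http.request", "body": b"", "more_body": False}

    async def send(message):
        # In real uvicorn this is where h11 raises LocalProtocolError
        # once the connection state has gone to ERROR.
        sent.append(message)

    await app({"type": "http", "method": "GET", "path": "/"}, receive, send)
    return sent

messages = asyncio.run(run_once())
```

The application code has no hook into `send`'s failure handling beyond catching the exception, which is consistent with an extension having nowhere to install an error handler for the connection itself.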

zixaphir commented 7 months ago

I've added progress bars in an attempt to "keep the connection alive" by tracking the progress of the various steps that go into scanning models, thereby pinging the UI more often. Please let me know if this has helped.
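The approach is plausible because Gradio streams intermediate values from generator handlers back to the browser, so every yield produces traffic on the connection. A rough, framework-free sketch of the idea (function and message names are made up for illustration):

```python
def scan_models(model_files, process):
    # Yield a status message after every model instead of returning one
    # result at the end, so the UI (and the underlying HTTP connection)
    # sees regular activity during a long scan rather than one silent wait.
    total = len(model_files)
    for i, path in enumerate(model_files, start=1):
        process(path)  # hash, fetch civitai info, download preview, etc.
        yield f"Scanned {i}/{total}: {path}"
    yield f"Done: {total} scanned"
```

Wired up as a Gradio event handler, each yielded string would update the UI and keep the keep-alive timer from expiring between models.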

QuietNoise commented 7 months ago

I had over 100 models to update. It didn't crash once. It could have been a lucky fluke, so I'll keep you posted if anything changes. Nevertheless, the progress bars are mesmerising regardless of whether this bug persists 😝

Update 2 days later: I've run a few more scans since, with 1.5 and SDXL checkpoints and loras, and so far haven't experienced any crash.

zixaphir commented 7 months ago

Alright, awesome! I'm going to go ahead and close this. If anyone else runs into this issue from here, feel free to reply and I'll reopen it.

Kaneda56 commented 3 months ago

I have the error "416: Range Not Satisfiable" when I'm scanning:

Civitai Helper: Checking preview image for model: S:\sept23\stable-diffusion-webui\models\Stable-diffusion\Réaliste\analogMadness_v70.safetensors
Civitai Helper: This image is NSFW: Soft
Civitai Helper: Start downloading from: https://image.civitai.com/xG1nkqKTMzGDvpLrqFT7WA/753f6342-592f-47a0-bfd0-8a98dca32ddc/width=1152/4559649.jpeg
Civitai Helper: Target file path: S:\sept23\stable-diffusion-webui\models\Stable-diffusion\Réaliste\analogMadness_v70.preview.png
Civitai Helper: File size: 138874 (135.62K)
Civitai Helper: Downloading to temp file: S:\sept23\stable-diffusion-webui\models\Stable-diffusion\Réaliste\analogMadness_v70.preview.png.downloading
Civitai Helper: Resuming partially downloaded file from progress: 138874
Civitai Helper: GET Request failed with error code: 416: Range Not Satisfiable
Traceback (most recent call last):
  File "S:\sept23\stable-diffusion-webui\venv\lib\site-packages\gradio\routes.py", line 488, in run_predict
    output = await app.get_blocks().process_api(
  File "S:\sept23\stable-diffusion-webui\venv\lib\site-packages\gradio\blocks.py", line 1431, in process_api
    result = await self.call_function(
  File "S:\sept23\stable-diffusion-webui\venv\lib\site-packages\gradio\blocks.py", line 1117, in call_function
    prediction = await utils.async_iteration(iterator)
  File "S:\sept23\stable-diffusion-webui\venv\lib\site-packages\gradio\utils.py", line 350, in async_iteration
    return await iterator.__anext__()
  File "S:\sept23\stable-diffusion-webui\venv\lib\site-packages\gradio\utils.py", line 343, in __anext__
    return await anyio.to_thread.run_sync(
  File "S:\sept23\stable-diffusion-webui\venv\lib\site-packages\anyio\to_thread.py", line 33, in run_sync
    return await get_asynclib().run_sync_in_worker_thread(
  File "S:\sept23\stable-diffusion-webui\venv\lib\site-packages\anyio\_backends\_asyncio.py", line 877, in run_sync_in_worker_thread

Issue is the same with Forge and Automatic 1111

zixaphir commented 3 months ago

Issue is the same with Forge and Automatic 1111

This is a different issue. Please open a new issue.