lobehub / sd-webui-lobe-theme

🅰️ Lobe theme - The modern theme for stable diffusion webui, exquisite interface design, highly customizable UI, and efficiency boosting features.
https://github.com/AUTOMATIC1111/stable-diffusion-webui
GNU Affero General Public License v3.0

[Bug] The Extra Networks panel on the right and the installed extensions list load infinitely #513

Open NaughtDZ opened 10 months ago

NaughtDZ commented 10 months ago

👀 Initial checklist

💻 Operating System

Windows

🌐 Browser

Edge

📦 SD WebUI Version or Commit

No response

📦 Lobe Theme Version or Commit

No response

🐛 Bug Description

Below are the extensions I'm using. The Extra Networks panel loads infinitely, and the installed extensions list also loads infinitely. 2024/01/09 21:58

a1111-sd-webui-tagcomplete 2024/01/10 23:42
adetailer 2023/11/24 10:28
multidiffusion-upscaler-for-automatic1111 2024/01/06 23:52
OneButtonPrompt 2023/11/24 10:28
prompt_translator 2024/01/12 13:00
sd-civitai-browser-plus 2023/11/24 10:28
sd-webui-3d-open-pose-editor 2023/12/20 23:54
sd-webui-animatediff 2023/11/24 10:28
sd-webui-aspect-ratio-helper 2024/01/12 13:00
sd-webui-controlnet 2024/01/07 11:35
sd-webui-deepdanbooru-object-recognition 2023/11/24 10:30
sd-webui-depth-lib 2024/01/08 17:58
sd-webui-e621-prompt 2023/11/24 17:45
sd-webui-freeu 2024/01/12 18:41
sd-webui-infinite-image-browsing 2024/01/12 17:48
sd-webui-lobe-theme 2023/11/24 10:30
sd-webui-oldsix-prompt 2023/12/26 22:37
sd-webui-openpose-editor 2024/01/06 23:52
sd-webui-reactor 2023/11/24 10:28
sd_delete_button 2023/11/27 21:34
stable-diffusion-webui-localization-zh_CN 2023/11/24 10:30
stable-diffusion-webui-localization-zh_Hans 2023/11/24 10:30
stable-diffusion-webui-promptgen 2023/11/24 10:30
stable-diffusion-webui-state 2023/11/24 10:30
stable-diffusion-webui-wildcards 2024/01/07 11:39
ultimate-upscale-for-automatic1111 2023/11/24 10:30
webui-qrcode-generator
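Since the hang only appears with a particular mix of extensions, one way to narrow down the culprit is to bisect the extensions folder: move half of the extensions aside, restart, and repeat on whichever half still hangs. The following is only a minimal helper sketch, not part of the webui; the install path is taken from the logs below, and the extensions_disabled folder name is my own assumption.

```python
# Hypothetical bisection helper: moves half of the installed extensions into a
# sibling "extensions_disabled" folder so the webui can be restarted without them.
from pathlib import Path
import shutil

WEBUI = Path(r"I:\stable-diffusion-webui")      # install path taken from the logs in this issue
EXT = WEBUI / "extensions"
DISABLED = WEBUI / "extensions_disabled"        # assumed name; any folder outside extensions/ works
DISABLED.mkdir(exist_ok=True)

# Keep sd-webui-lobe-theme in place so the theme UI itself is still loaded.
exts = sorted(p for p in EXT.iterdir() if p.is_dir() and p.name != "sd-webui-lobe-theme")
half = exts[: len(exts) // 2]                   # first half; swap halves on the next round

for ext in half:
    print("disabling", ext.name)
    shutil.move(str(ext), str(DISABLED / ext.name))

print("Restart the webui; if the panels load now, the culprit is in", DISABLED)
```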


I've confirmed that launching with --no-gradio-queue fixes it, but it produces a pile of errors:

To create a public link, set share=True in launch().
Loading VAE weights specified in settings: I:\stable-diffusion-webui\models\VAE\kl-f8-anime2.vae.pt
Applying attention optimization: xformers... done.
🤯 LobeTheme: Initializing...
Startup time: 38.0s (prepare environment: 14.1s, import torch: 4.8s, import gradio: 1.8s, setup paths: 1.2s, initialize shared: 0.3s, other imports: 1.0s, setup codeformer: 0.3s, list SD models: 0.2s, load scripts: 11.0s, create ui: 1.9s, gradio launch: 1.3s).
Model loaded in 7.6s (load weights from disk: 0.2s, create model: 1.0s, apply weights to model: 1.4s, load VAE: 0.3s, load textual inversion embeddings: 4.5s, calculate empty prompt: 0.1s).
ERROR:asyncio:Exception in callback H11Protocol.timeout_keep_alive_handler()
handle: <TimerHandle when=26031.187 H11Protocol.timeout_keep_alive_handler()>
Traceback (most recent call last):
  File "I:\stable-diffusion-webui\venv\lib\site-packages\h11\_state.py", line 249, in _fire_event_triggered_transitions
    new_state = EVENT_TRIGGERED_TRANSITIONS[role][state][event_type]
KeyError: <class 'h11._events.ConnectionClosed'>

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Program Files\Python310\lib\asyncio\events.py", line 80, in _run
    self._context.run(self._callback, *self._args)
  File "I:\stable-diffusion-webui\venv\lib\site-packages\uvicorn\protocols\http\h11_impl.py", line 363, in timeout_keep_alive_handler
    self.conn.send(event)
  File "I:\stable-diffusion-webui\venv\lib\site-packages\h11\_connection.py", line 468, in send
    data_list = self.send_with_data_passthrough(event)
  File "I:\stable-diffusion-webui\venv\lib\site-packages\h11\_connection.py", line 493, in send_with_data_passthrough
    self._process_event(self.our_role, event)
  File "I:\stable-diffusion-webui\venv\lib\site-packages\h11\_connection.py", line 242, in _process_event
    self._cstate.process_event(role, type(event), server_switch_event)
  File "I:\stable-diffusion-webui\venv\lib\site-packages\h11\_state.py", line 238, in process_event
    self._fire_event_triggered_transitions(role, event_type)
  File "I:\stable-diffusion-webui\venv\lib\site-packages\h11\_state.py", line 251, in _fire_event_triggered_transitions
    raise LocalProtocolError(
h11._util.LocalProtocolError: can't handle event type ConnectionClosed when role=SERVER and state=SEND_RESPONSE

*** API error: POST: http://127.0.0.1:7860/run/predict {'error': 'LocalProtocolError', 'detail': '', 'body': '', 'errors': "Can't send data when our state is ERROR"}
Traceback (most recent call last):
  File "I:\stable-diffusion-webui\venv\lib\site-packages\starlette\middleware\errors.py", line 162, in __call__
    await self.app(scope, receive, _send)
  File "I:\stable-diffusion-webui\venv\lib\site-packages\starlette\middleware\base.py", line 109, in __call__
    await response(scope, receive, send)
  File "I:\stable-diffusion-webui\venv\lib\site-packages\starlette\responses.py", line 270, in __call__
    async with anyio.create_task_group() as task_group:
  File "I:\stable-diffusion-webui\venv\lib\site-packages\anyio\_backends\_asyncio.py", line 597, in __aexit__
    raise exceptions[0]
  File "I:\stable-diffusion-webui\venv\lib\site-packages\starlette\responses.py", line 273, in wrap
    await func()
  File "I:\stable-diffusion-webui\venv\lib\site-packages\starlette\middleware\base.py", line 134, in stream_response
    return await super().stream_response(send)
  File "I:\stable-diffusion-webui\venv\lib\site-packages\starlette\responses.py", line 255, in stream_response
    await send(
  File "I:\stable-diffusion-webui\venv\lib\site-packages\starlette\middleware\errors.py", line 159, in _send
    await send(message)
  File "I:\stable-diffusion-webui\venv\lib\site-packages\uvicorn\protocols\http\h11_impl.py", line 490, in send
    output = self.conn.send(event=response)
  File "I:\stable-diffusion-webui\venv\lib\site-packages\h11\_connection.py", line 468, in send
    data_list = self.send_with_data_passthrough(event)
  File "I:\stable-diffusion-webui\venv\lib\site-packages\h11\_connection.py", line 483, in send_with_data_passthrough
    raise LocalProtocolError("Can't send data when our state is ERROR")
h11._util.LocalProtocolError: Can't send data when our state is ERROR

ERROR: Exception in ASGI application
Traceback (most recent call last):
  File "I:\stable-diffusion-webui\venv\lib\site-packages\uvicorn\protocols\http\h11_impl.py", line 408, in run_asgi
    result = await app(  # type: ignore[func-returns-value]
  File "I:\stable-diffusion-webui\venv\lib\site-packages\uvicorn\middleware\proxy_headers.py", line 84, in __call__
    return await self.app(scope, receive, send)
  File "I:\stable-diffusion-webui\venv\lib\site-packages\fastapi\applications.py", line 273, in __call__
    await super().__call__(scope, receive, send)
  File "I:\stable-diffusion-webui\venv\lib\site-packages\starlette\applications.py", line 122, in __call__
    await self.middleware_stack(scope, receive, send)
  File "I:\stable-diffusion-webui\venv\lib\site-packages\starlette\middleware\errors.py", line 184, in __call__
    raise exc
  File "I:\stable-diffusion-webui\venv\lib\site-packages\starlette\middleware\errors.py", line 162, in __call__
    await self.app(scope, receive, _send)
  File "I:\stable-diffusion-webui\venv\lib\site-packages\starlette\middleware\base.py", line 109, in __call__
    await response(scope, receive, send)
  File "I:\stable-diffusion-webui\venv\lib\site-packages\starlette\responses.py", line 270, in __call__
    async with anyio.create_task_group() as task_group:
  File "I:\stable-diffusion-webui\venv\lib\site-packages\anyio\_backends\_asyncio.py", line 597, in __aexit__
    raise exceptions[0]
  File "I:\stable-diffusion-webui\venv\lib\site-packages\starlette\responses.py", line 273, in wrap
    await func()
  File "I:\stable-diffusion-webui\venv\lib\site-packages\starlette\middleware\base.py", line 134, in stream_response
    return await super().stream_response(send)
  File "I:\stable-diffusion-webui\venv\lib\site-packages\starlette\responses.py", line 255, in stream_response
    await send(
  File "I:\stable-diffusion-webui\venv\lib\site-packages\starlette\middleware\errors.py", line 159, in _send
    await send(message)
  File "I:\stable-diffusion-webui\venv\lib\site-packages\uvicorn\protocols\http\h11_impl.py", line 490, in send
    output = self.conn.send(event=response)
  File "I:\stable-diffusion-webui\venv\lib\site-packages\h11\_connection.py", line 468, in send
    data_list = self.send_with_data_passthrough(event)
  File "I:\stable-diffusion-webui\venv\lib\site-packages\h11\_connection.py", line 483, in send_with_data_passthrough
    raise LocalProtocolError("Can't send data when our state is ERROR")
h11._util.LocalProtocolError: Can't send data when our state is ERROR
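The first traceback comes from uvicorn's HTTP keep-alive timer (H11Protocol.timeout_keep_alive_handler) firing while a response on the /run/predict path is still being written, which then leaves the h11 connection in the ERROR state seen in the second traceback. The webui does not expose this timer directly; the snippet below is only a standalone illustration of the parameter involved, not the webui's launch code.

```python
# Standalone illustration of uvicorn's keep-alive timer, the source of
# H11Protocol.timeout_keep_alive_handler in the traceback above. This is NOT
# how stable-diffusion-webui starts its server; it only shows the knob.
from fastapi import FastAPI
import uvicorn

app = FastAPI()

@app.get("/ping")
def ping() -> dict:
    return {"ok": True}

if __name__ == "__main__":
    # uvicorn closes idle keep-alive connections after timeout_keep_alive
    # seconds (default 5). If that timer fires while a slow response is still
    # in flight, h11 raises the LocalProtocolError seen in the log.
    # The value below is only an example.
    uvicorn.run(app, host="127.0.0.1", port=8000, timeout_keep_alive=30)
```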

🚦 Expected Behavior

--no-gradio-queue does make the UI work normally, but the pile of errors it brings may trigger other bugs. I hope the theme author can figure out why adding --no-gradio-queue resolves the problem and, working backwards from that, fix the underlying bug.
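For context on why the flag changes behaviour: with the gradio queue enabled, the frontend exchanges events with the backend over a websocket; --no-gradio-queue skips the queue, so the older HTTP /run/predict path is used instead. The sketch below is a rough, from-memory paraphrase of that gating, not an exact copy of the webui's webui.py.

```python
# Rough sketch of what --no-gradio-queue gates at startup (paraphrased, not
# copied from webui.py). With the queue enabled, gradio serves events over a
# websocket; without it, the UI falls back to plain HTTP POSTs to
# /run/predict, which is the endpoint failing in the logs above.
import gradio as gr

def echo(text: str) -> str:
    return text

with gr.Blocks() as demo:
    inp = gr.Textbox(label="in")
    out = gr.Textbox(label="out")
    inp.submit(echo, inp, out)

no_gradio_queue = False  # stand-in for cmd_opts.no_gradio_queue
if not no_gradio_queue:
    demo.queue(concurrency_count=64)  # websocket-backed queue (gradio 3.x signature)

demo.launch(server_name="127.0.0.1", server_port=7861)
```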

📷 Reproduction Steps

No response

📝 Additional Information

No response

lobehubbot commented 10 months ago

👀 @NaughtDZ

Thank you for raising an issue. We will investigate the matter and get back to you as soon as possible. Please make sure you have given us as much context as possible.

kaalibro commented 10 months ago

Try to uninstall the "sd-webui-depth-lib" and install it from this fork: https://github.com/wywywywy/sd-webui-depth-lib

Please inform if the issue with --no-gradio-queue persists after reinstalling this extension.
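For reference, the reinstall can be done by renaming the old folder and cloning the fork into extensions/. The following is only a small helper sketch; the path is taken from the issue's logs, it assumes git is on PATH, and the .bak suffix is my own choice.

```python
# Hedged helper for the suggestion above: back up the current
# sd-webui-depth-lib and clone wywywywy's fork in its place.
import shutil
import subprocess
from pathlib import Path

EXT_DIR = Path(r"I:\stable-diffusion-webui\extensions")   # path from the issue's logs
old = EXT_DIR / "sd-webui-depth-lib"

if old.exists():
    shutil.move(str(old), str(EXT_DIR / "sd-webui-depth-lib.bak"))  # keep a backup

subprocess.run(
    ["git", "clone", "https://github.com/wywywywy/sd-webui-depth-lib", str(old)],
    check=True,
)
print("Done - restart the webui to pick up the reinstalled extension.")
```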

NaughtDZ commented 10 months ago

Try to uninstall the "sd-webui-depth-lib" and install it from this fork: https://github.com/wywywywy/sd-webui-depth-lib

Please inform if the issue with --no-gradio-queue persists after reinstalling this extension.


Still getting errors:


To create a public link, set share=True in launch().
🤯 LobeTheme: Initializing...
Startup time: 44.2s (prepare environment: 15.1s, import torch: 6.5s, import gradio: 2.5s, setup paths: 2.0s, initialize shared: 0.3s, other imports: 1.7s, setup codeformer: 0.4s, list SD models: 0.4s, load scripts: 9.9s, create ui: 4.0s, gradio launch: 1.0s, add APIs: 0.1s, app_started_callback: 0.1s).
Loading VAE weights specified in settings: I:\stable-diffusion-webui\models\VAE\kl-f8-anime2.vae.pt
Applying attention optimization: xformers... done.
ERROR:asyncio:Exception in callback H11Protocol.timeout_keep_alive_handler()
handle: <TimerHandle when=52773.156 H11Protocol.timeout_keep_alive_handler()>
Traceback (most recent call last):
  File "I:\stable-diffusion-webui\venv\lib\site-packages\h11\_state.py", line 249, in _fire_event_triggered_transitions
    new_state = EVENT_TRIGGERED_TRANSITIONS[role][state][event_type]
KeyError: <class 'h11._events.ConnectionClosed'>

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Program Files\Python310\lib\asyncio\events.py", line 80, in _run
    self._context.run(self._callback, *self._args)
  File "I:\stable-diffusion-webui\venv\lib\site-packages\uvicorn\protocols\http\h11_impl.py", line 363, in timeout_keep_alive_handler
    self.conn.send(event)
  File "I:\stable-diffusion-webui\venv\lib\site-packages\h11\_connection.py", line 468, in send
    data_list = self.send_with_data_passthrough(event)
  File "I:\stable-diffusion-webui\venv\lib\site-packages\h11\_connection.py", line 493, in send_with_data_passthrough
    self._process_event(self.our_role, event)
  File "I:\stable-diffusion-webui\venv\lib\site-packages\h11\_connection.py", line 242, in _process_event
    self._cstate.process_event(role, type(event), server_switch_event)
  File "I:\stable-diffusion-webui\venv\lib\site-packages\h11\_state.py", line 238, in process_event
    self._fire_event_triggered_transitions(role, event_type)
  File "I:\stable-diffusion-webui\venv\lib\site-packages\h11\_state.py", line 251, in _fire_event_triggered_transitions
    raise LocalProtocolError(
h11._util.LocalProtocolError: can't handle event type ConnectionClosed when role=SERVER and state=SEND_RESPONSE

Model loaded in 20.1s (load weights from disk: 1.0s, create model: 0.6s, apply weights to model: 13.6s, load VAE: 1.4s, load textual inversion embeddings: 3.0s, calculate empty prompt: 0.4s).
*** API error: POST: http://127.0.0.1:7860/run/predict {'error': 'LocalProtocolError', 'detail': '', 'body': '', 'errors': "Can't send data when our state is ERROR"}
Traceback (most recent call last):
  File "I:\stable-diffusion-webui\venv\lib\site-packages\starlette\middleware\errors.py", line 162, in __call__
    await self.app(scope, receive, _send)
  File "I:\stable-diffusion-webui\venv\lib\site-packages\starlette\middleware\base.py", line 109, in __call__
    await response(scope, receive, send)
  File "I:\stable-diffusion-webui\venv\lib\site-packages\starlette\responses.py", line 270, in __call__
    async with anyio.create_task_group() as task_group:
  File "I:\stable-diffusion-webui\venv\lib\site-packages\anyio\_backends\_asyncio.py", line 597, in __aexit__
    raise exceptions[0]
  File "I:\stable-diffusion-webui\venv\lib\site-packages\starlette\responses.py", line 273, in wrap
    await func()
  File "I:\stable-diffusion-webui\venv\lib\site-packages\starlette\middleware\base.py", line 134, in stream_response
    return await super().stream_response(send)
  File "I:\stable-diffusion-webui\venv\lib\site-packages\starlette\responses.py", line 255, in stream_response
    await send(
  File "I:\stable-diffusion-webui\venv\lib\site-packages\starlette\middleware\errors.py", line 159, in _send
    await send(message)
  File "I:\stable-diffusion-webui\venv\lib\site-packages\uvicorn\protocols\http\h11_impl.py", line 490, in send
    output = self.conn.send(event=response)
  File "I:\stable-diffusion-webui\venv\lib\site-packages\h11\_connection.py", line 468, in send
    data_list = self.send_with_data_passthrough(event)
  File "I:\stable-diffusion-webui\venv\lib\site-packages\h11\_connection.py", line 483, in send_with_data_passthrough
    raise LocalProtocolError("Can't send data when our state is ERROR")
h11._util.LocalProtocolError: Can't send data when our state is ERROR


ERROR: Exception in ASGI application
Traceback (most recent call last):
  File "I:\stable-diffusion-webui\venv\lib\site-packages\uvicorn\protocols\http\h11_impl.py", line 408, in run_asgi
    result = await app(  # type: ignore[func-returns-value]
  File "I:\stable-diffusion-webui\venv\lib\site-packages\uvicorn\middleware\proxy_headers.py", line 84, in __call__
    return await self.app(scope, receive, send)
  File "I:\stable-diffusion-webui\venv\lib\site-packages\fastapi\applications.py", line 273, in __call__
    await super().__call__(scope, receive, send)
  File "I:\stable-diffusion-webui\venv\lib\site-packages\starlette\applications.py", line 122, in __call__
    await self.middleware_stack(scope, receive, send)
  File "I:\stable-diffusion-webui\venv\lib\site-packages\starlette\middleware\errors.py", line 184, in __call__
    raise exc
  File "I:\stable-diffusion-webui\venv\lib\site-packages\starlette\middleware\errors.py", line 162, in __call__
    await self.app(scope, receive, _send)
  File "I:\stable-diffusion-webui\venv\lib\site-packages\starlette\middleware\base.py", line 109, in __call__
    await response(scope, receive, send)
  File "I:\stable-diffusion-webui\venv\lib\site-packages\starlette\responses.py", line 270, in __call__
    async with anyio.create_task_group() as task_group:
  File "I:\stable-diffusion-webui\venv\lib\site-packages\anyio\_backends\_asyncio.py", line 597, in __aexit__
    raise exceptions[0]
  File "I:\stable-diffusion-webui\venv\lib\site-packages\starlette\responses.py", line 273, in wrap
    await func()
  File "I:\stable-diffusion-webui\venv\lib\site-packages\starlette\middleware\base.py", line 134, in stream_response
    return await super().stream_response(send)
  File "I:\stable-diffusion-webui\venv\lib\site-packages\starlette\responses.py", line 255, in stream_response
    await send(
  File "I:\stable-diffusion-webui\venv\lib\site-packages\starlette\middleware\errors.py", line 159, in _send
    await send(message)
  File "I:\stable-diffusion-webui\venv\lib\site-packages\uvicorn\protocols\http\h11_impl.py", line 490, in send
    output = self.conn.send(event=response)
  File "I:\stable-diffusion-webui\venv\lib\site-packages\h11\_connection.py", line 468, in send
    data_list = self.send_with_data_passthrough(event)
  File "I:\stable-diffusion-webui\venv\lib\site-packages\h11\_connection.py", line 483, in send_with_data_passthrough
    raise LocalProtocolError("Can't send data when our state is ERROR")
h11._util.LocalProtocolError: Can't send data when our state is ERROR

NaughtDZ commented 10 months ago

One more thing: adding --no-gradio-queue breaks downloads in the Civitai Browser extension, so there is no acceptable trade-off. The only option is to remove the launch flag whenever I need to download or update a model and then add it back afterwards, which is a real hassle.

lobehubbot commented 10 months ago

Bot detected the issue body's language is not English, translate it automatically. 👯👭🏻🧑‍🤝‍🧑👫🧑🏿‍🤝‍🧑🏻👩🏾‍🤝‍👨🏿👬🏿


In addition, adding --no-gradio-queue causes problems with downloads made through the Civitai Browser extension, so there is no good trade-off. The startup flag has to be removed whenever a model needs to be downloaded or updated and then added back, which is very troublesome.
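One way to avoid editing the launch arguments by hand every time is a tiny launcher that only appends --no-gradio-queue when no downloads are planned. This is only a sketch under assumptions: it is saved in the webui root and run with the venv's python, launch.py is the standard entry point, and the "download" argument and --xformers flag are placeholders for whatever you normally use.

```python
# Hypothetical launcher: "python run_webui.py" starts with --no-gradio-queue,
# while "python run_webui.py download" starts without it so Civitai Browser
# downloads keep working.
import subprocess
import sys

flags = ["--xformers"]                      # placeholder for your usual flags
if "download" not in sys.argv[1:]:
    flags.append("--no-gradio-queue")

# launch.py is stable-diffusion-webui's entry point; run from the webui root.
subprocess.run([sys.executable, "launch.py", *flags], check=True)
```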

herbieliang commented 9 months ago

Is there a solution to this problem yet?

lobehubbot commented 9 months ago

Bot detected the issue body's language is not English, translate it automatically. 👯👭🏻🧑‍🤝‍🧑👫🧑🏿‍🤝‍🧑🏻👩🏾‍🤝‍👨🏿👬🏿


Is there any solution to this problem?