andyzhou1982 opened 1 year ago
same question
Same here with 1.9.0 version :(
2024-04-15 21:30:03.2367069 [W:onnxruntime:Default, tensorrt_execution_provider.h:83 onnxruntime::TensorrtLogger::log] [2024-04-15 19:30:03 WARNING] onnx2trt_utils.cpp:369: Your ONNX model has been generated with INT64 weights, while TensorRT does not natively support INT64. Attempting to cast down to INT32.
2024-04-15 21:30:38.3152924 [W:onnxruntime:Default, tensorrt_execution_provider.h:83 onnxruntime::TensorrtLogger::log] [2024-04-15 19:30:38 WARNING] The getMaxBatchSize() function should not be used with an engine built from a network created with NetworkDefinitionCreationFlag::kEXPLICIT_BATCH flag. This function will always return 1.
2024-04-15 21:30:38.3766681 [W:onnxruntime:Default, tensorrt_execution_provider.h:83 onnxruntime::TensorrtLogger::log] [2024-04-15 19:30:38 WARNING] The getMaxBatchSize() function should not be used with an engine built from a network created with NetworkDefinitionCreationFlag::kEXPLICIT_BATCH flag. This function will always return 1.
ERROR:asyncio:Exception in callback H11Protocol.timeout_keep_alive_handler()
handle: <TimerHandle when=3758.609 H11Protocol.timeout_keep_alive_handler()>
Traceback (most recent call last):
  File "I:\Stable-Diffusion-Automatic\stable-diffusion-webui\venv\lib\site-packages\h11\_state.py", line 249, in _fire_event_triggered_transitions
    new_state = EVENT_TRIGGERED_TRANSITIONS[role][state][event_type]
KeyError: <class 'h11._events.ConnectionClosed'>

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Users\Mykee\AppData\Local\Programs\Python\Python310\lib\asyncio\events.py", line 80, in _run
    self._context.run(self._callback, *self._args)
  File "I:\Stable-Diffusion-Automatic\stable-diffusion-webui\venv\lib\site-packages\uvicorn\protocols\http\h11_impl.py", line 362, in timeout_keep_alive_handler
    self.conn.send(event)
  File "I:\Stable-Diffusion-Automatic\stable-diffusion-webui\venv\lib\site-packages\h11\_connection.py", line 468, in send
    data_list = self.send_with_data_passthrough(event)
  File "I:\Stable-Diffusion-Automatic\stable-diffusion-webui\venv\lib\site-packages\h11\_connection.py", line 493, in send_with_data_passthrough
    self._process_event(self.our_role, event)
  File "I:\Stable-Diffusion-Automatic\stable-diffusion-webui\venv\lib\site-packages\h11\_connection.py", line 242, in _process_event
    self._cstate.process_event(role, type(event), server_switch_event)
  File "I:\Stable-Diffusion-Automatic\stable-diffusion-webui\venv\lib\site-packages\h11\_state.py", line 238, in process_event
    self._fire_event_triggered_transitions(role, event_type)
  File "I:\Stable-Diffusion-Automatic\stable-diffusion-webui\venv\lib\site-packages\h11\_state.py", line 251, in _fire_event_triggered_transitions
    raise LocalProtocolError(
h11._util.LocalProtocolError: can't handle event type ConnectionClosed when role=SERVER and state=SEND_RESPONSE
*** API error: POST: http://127.0.0.1:7860/api/predict {'error': 'LocalProtocolError', 'detail': '', 'body': '', 'errors': "Can't send data when our state is ERROR"}
Traceback (most recent call last):
  File "I:\Stable-Diffusion-Automatic\stable-diffusion-webui\venv\lib\site-packages\starlette\middleware\errors.py", line 162, in __call__
    await self.app(scope, receive, _send)
  File "I:\Stable-Diffusion-Automatic\stable-diffusion-webui\venv\lib\site-packages\starlette\middleware\base.py", line 109, in __call__
    await response(scope, receive, send)
  File "I:\Stable-Diffusion-Automatic\stable-diffusion-webui\venv\lib\site-packages\starlette\responses.py", line 270, in __call__
    async with anyio.create_task_group() as task_group:
  File "I:\Stable-Diffusion-Automatic\stable-diffusion-webui\venv\lib\site-packages\anyio\_backends\_asyncio.py", line 662, in __aexit__
    raise exceptions[0]
  File "I:\Stable-Diffusion-Automatic\stable-diffusion-webui\venv\lib\site-packages\starlette\responses.py", line 273, in wrap
    await func()
  File "I:\Stable-Diffusion-Automatic\stable-diffusion-webui\venv\lib\site-packages\starlette\middleware\base.py", line 134, in stream_response
    return await super().stream_response(send)
  File "I:\Stable-Diffusion-Automatic\stable-diffusion-webui\venv\lib\site-packages\starlette\responses.py", line 255, in stream_response
    await send(
  File "I:\Stable-Diffusion-Automatic\stable-diffusion-webui\venv\lib\site-packages\starlette\middleware\errors.py", line 159, in _send
    await send(message)
  File "I:\Stable-Diffusion-Automatic\stable-diffusion-webui\venv\lib\site-packages\uvicorn\protocols\http\h11_impl.py", line 494, in send
    output = self.conn.send(event)
  File "I:\Stable-Diffusion-Automatic\stable-diffusion-webui\venv\lib\site-packages\h11\_connection.py", line 468, in send
    data_list = self.send_with_data_passthrough(event)
  File "I:\Stable-Diffusion-Automatic\stable-diffusion-webui\venv\lib\site-packages\h11\_connection.py", line 483, in send_with_data_passthrough
    raise LocalProtocolError("Can't send data when our state is ERROR")
h11._util.LocalProtocolError: Can't send data when our state is ERROR
---
ERROR: Exception in ASGI application
Traceback (most recent call last):
  File "I:\Stable-Diffusion-Automatic\stable-diffusion-webui\venv\lib\site-packages\uvicorn\protocols\http\h11_impl.py", line 407, in run_asgi
    result = await app(  # type: ignore[func-returns-value]
  File "I:\Stable-Diffusion-Automatic\stable-diffusion-webui\venv\lib\site-packages\uvicorn\middleware\proxy_headers.py", line 78, in __call__
    return await self.app(scope, receive, send)
  File "I:\Stable-Diffusion-Automatic\stable-diffusion-webui\venv\lib\site-packages\fastapi\applications.py", line 273, in __call__
    await super().__call__(scope, receive, send)
  File "I:\Stable-Diffusion-Automatic\stable-diffusion-webui\venv\lib\site-packages\starlette\applications.py", line 122, in __call__
    await self.middleware_stack(scope, receive, send)
  File "I:\Stable-Diffusion-Automatic\stable-diffusion-webui\venv\lib\site-packages\starlette\middleware\errors.py", line 184, in __call__
    raise exc
  File "I:\Stable-Diffusion-Automatic\stable-diffusion-webui\venv\lib\site-packages\starlette\middleware\errors.py", line 162, in __call__
    await self.app(scope, receive, _send)
  File "I:\Stable-Diffusion-Automatic\stable-diffusion-webui\venv\lib\site-packages\starlette\middleware\base.py", line 109, in __call__
    await response(scope, receive, send)
  File "I:\Stable-Diffusion-Automatic\stable-diffusion-webui\venv\lib\site-packages\starlette\responses.py", line 270, in __call__
    async with anyio.create_task_group() as task_group:
  File "I:\Stable-Diffusion-Automatic\stable-diffusion-webui\venv\lib\site-packages\anyio\_backends\_asyncio.py", line 662, in __aexit__
    raise exceptions[0]
  File "I:\Stable-Diffusion-Automatic\stable-diffusion-webui\venv\lib\site-packages\starlette\responses.py", line 273, in wrap
    await func()
  File "I:\Stable-Diffusion-Automatic\stable-diffusion-webui\venv\lib\site-packages\starlette\middleware\base.py", line 134, in stream_response
    return await super().stream_response(send)
  File "I:\Stable-Diffusion-Automatic\stable-diffusion-webui\venv\lib\site-packages\starlette\responses.py", line 255, in stream_response
    await send(
  File "I:\Stable-Diffusion-Automatic\stable-diffusion-webui\venv\lib\site-packages\starlette\middleware\errors.py", line 159, in _send
    await send(message)
  File "I:\Stable-Diffusion-Automatic\stable-diffusion-webui\venv\lib\site-packages\uvicorn\protocols\http\h11_impl.py", line 494, in send
    output = self.conn.send(event)
  File "I:\Stable-Diffusion-Automatic\stable-diffusion-webui\venv\lib\site-packages\h11\_connection.py", line 468, in send
    data_list = self.send_with_data_passthrough(event)
  File "I:\Stable-Diffusion-Automatic\stable-diffusion-webui\venv\lib\site-packages\h11\_connection.py", line 483, in send_with_data_passthrough
    raise LocalProtocolError("Can't send data when our state is ERROR")
h11._util.LocalProtocolError: Can't send data when our state is ERROR
'AsyncRequest' object has no attribute '_json_response_data'
ERROR:asyncio:Task exception was never retrieved
future: <Task finished name='6pug0y1gaqk_2176' coro=<Queue.process_events() done, defined at I:\Stable-Diffusion-Automatic\stable-diffusion-webui\venv\lib\site-packages\gradio\queueing.py:384> exception=ValueError('[<gradio.queueing.Event object at 0x000001728772B580>] is not in list')>
Traceback (most recent call last):
  File "I:\Stable-Diffusion-Automatic\stable-diffusion-webui\venv\lib\site-packages\gradio\queueing.py", line 471, in process_events
    self.active_jobs[self.active_jobs.index(events)] = None
ValueError: [<gradio.queueing.Event object at 0x000001728772B580>] is not in list
Could you integrate the fix from https://github.com/onnx/tensorflow-onnx/issues/883 or from https://github.com/NVIDIA/TensorRT/issues/2542?
When I use the rembg plugin, I get this error message and then the console breaks down.