[Closed] gsgoldma closed this issue 1 year ago
Can you share the prompt and settings you were using when the error occurred? It may or may not be related to how we handle multiple interpolations in a single prompt.
I'd also appreciate it if you could share the error message you are getting. I assume it is an out-of-memory error?
I need to know how many interpolations are in the prompt and how they are arranged together to see whether this is related to #20.
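For context, an interpolation in this syntax blends between sub-prompts over a range of sampling steps. A minimal sketch of the scheduling involved, assuming a simple linear ramp (the function name and the linearity are my assumptions, not the extension's actual implementation):

```python
def interpolation_weight(step: int, start: int, end: int) -> float:
    """Hypothetical sketch: fraction of the way from the first sub-prompt
    to the second at a given sampling step, for a [a :b:c :, start, end]
    style schedule. Clamped outside the [start, end] range."""
    if step <= start:
        return 0.0
    if step >= end:
        return 1.0
    return (step - start) / (end - start)

# With the reporter's schedule [..., 1, 13], step 7 is exactly halfway:
print(interpolation_weight(7, 1, 13))  # 0.5
```

The point is that each interpolation contributes an extra conditioning to blend at every step, which is why the number and arrangement of interpolations in one prompt matters for memory use.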
Funny thing is, if I wait a few minutes and then try a regular prompt, it will sometimes work again, and then I can go back to fusion prompting. It's a bit unpredictable when it happens and when it recovers; it's been happening all morning. If I turn on a LoRA and then disable it, that sometimes makes it work again too. Very strange.
[foggy night :desolate town:sherbert ice cream :, 1, 13] photorealistic, delicious
Steps: 20, Sampler: Euler a, CFG scale: 7, Seed: 3945741849, Size: 512x512, Model hash: 47236be899, Model: New_realgrapenew, ENSD: 31337, Score: 5.31
Template: [foggy night :desolate town:sherbert ice cream :, 1, 13] photorealistic, delicious
35%|█████████████████████████████ | 7/20 [00:07<00:14, 1.13s/it]
Error completing request
Arguments: ('task(a6g6nxppcupf62d)', '[foggy night :desolate town:sherbert ice cream :, 1, 13] photorealistic, delicious', '', [], 20, 0, False, False, 1, 1, 7, -1.0, -1.0, 0, 0, 0, False, 512, 512, False, 0.7, 2, 'Latent', 0, 0, 0, [], 0, 0, 0, 0, 0, 0.25, False, True, False, 0, -1, True, 'keyword prompt', 'keyword1, keyword2', 'None', 'textual inversion first', True, False, 1, False, False, False, 1.1, 1.5, 100, 0.7, False, False, True, False, False, 0, 'Gustavosta/MagicPrompt-Stable-Diffusion', '', False, 'x264', 'mci', 10, 0, False, True, True, True, 'intermediate', 'animation', True, False,, 0.65, 0.65, 'LoRA', 'None', 1, 1, 'LoRA', 'None', 1, 1, 'LoRA', 'None', 1, 1, 'LoRA', 'None', 1, 1, 'Refresh models', None, '', 'Get Tags', '', '', 0.9, 5, '0.0001', False, 'None', '', 0.1, False, 0, True, -1.0, False, '', True, False, 0, 384, 0, True, True, True, 1, '\n
{2$$artist1|artist2|artist3}
\n If $$ is not provided, then 1$$ is assumed.\n Available wildcards
\nWILDCARD_DIR: scripts/wildcards
mywildcards
will then become available.\n ', None, '', 'outputs', 1, '', 0, '', True, False, False, False, False, False, 1, 1, False, False, '', 1, True, 100, '', '', 8, True, 16, 'Median cut', 8, True, True, 16, 'PNN', True, False, None, None, '', '', '', '', 'Auto rename', {'label': 'Upload avatars config'}, 'Open outputs directory', 'Export to WebUI style', True, {'label': 'Presets'}, {'label': 'QC preview'}, '', [], 'Select', 'QC scan', 'Show pics', None, False, False, '', 25, True, 5.0, False, False, '', '', '', 'Positive', 0, ', ', True, 32, 0, 'Median cut', 'luminance', False, 'svg', True, True, False, 0.5, 1, '', 0, '', 0, '', True, False, False, False, 'Not set', True, True, '', '', '', '', '', 1.3, 'Not set', 'Not set', 1.3, 'Not set', 1.3, 'Not set', 1.3, 1.3, 'Not set', 1.3, 'Not set', 1.3, 'Not set', 1.3, 'Not set', 1.3, 'Not set', 1.3, 'Not set', False, 'None', '', 'None', 30, 4, 0, 0, False, 'None',
ERROR: Exception in ASGI application
Traceback (most recent call last):
  File "D:\stable-diffusion-webui\venv\lib\site-packages\anyio\streams\memory.py", line 94, in receive
    return self.receive_nowait()
  File "D:\stable-diffusion-webui\venv\lib\site-packages\anyio\streams\memory.py", line 89, in receive_nowait
    raise WouldBlock
anyio.WouldBlock
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
  File "D:\stable-diffusion-webui\venv\lib\site-packages\starlette\middleware\base.py", line 77, in call_next
    message = await recv_stream.receive()
  File "D:\stable-diffusion-webui\venv\lib\site-packages\anyio\streams\memory.py", line 114, in receive
    raise EndOfStream
anyio.EndOfStream
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
  File "D:\stable-diffusion-webui\venv\lib\site-packages\uvicorn\protocols\http\h11_impl.py", line 407, in run_asgi
    result = await app(  # type: ignore[func-returns-value]
  File "D:\stable-diffusion-webui\venv\lib\site-packages\uvicorn\middleware\proxy_headers.py", line 78, in __call__
    return await self.app(scope, receive, send)
  File "D:\stable-diffusion-webui\venv\lib\site-packages\fastapi\applications.py", line 270, in __call__
    await super().__call__(scope, receive, send)
  File "D:\stable-diffusion-webui\venv\lib\site-packages\starlette\applications.py", line 124, in __call__
    await self.middleware_stack(scope, receive, send)
  File "D:\stable-diffusion-webui\venv\lib\site-packages\starlette\middleware\errors.py", line 184, in __call__
    raise exc
  File "D:\stable-diffusion-webui\venv\lib\site-packages\starlette\middleware\errors.py", line 162, in __call__
    await self.app(scope, receive, _send)
  File "D:\stable-diffusion-webui\venv\lib\site-packages\starlette\middleware\base.py", line 106, in __call__
    response = await self.dispatch_func(request, call_next)
  File "D:\stable-diffusion-webui\extensions\auto-sd-paint-ext\backend\app.py", line 391, in app_encryption_middleware
    res: StreamingResponse = await call_next(req)
  File "D:\stable-diffusion-webui\venv\lib\site-packages\starlette\middleware\base.py", line 80, in call_next
    raise app_exc
  File "D:\stable-diffusion-webui\venv\lib\site-packages\starlette\middleware\base.py", line 69, in coro
    await self.app(scope, receive_or_disconnect, send_no_error)
  File "D:\stable-diffusion-webui\venv\lib\site-packages\starlette\middleware\base.py", line 106, in __call__
    response = await self.dispatch_func(request, call_next)
  File "D:\stable-diffusion-webui\modules\api\api.py", line 96, in log_and_time
    res: Response = await call_next(req)
  File "D:\stable-diffusion-webui\venv\lib\site-packages\starlette\middleware\base.py", line 80, in call_next
    raise app_exc
  File "D:\stable-diffusion-webui\venv\lib\site-packages\starlette\middleware\base.py", line 69, in coro
    await self.app(scope, receive_or_disconnect, send_no_error)
  File "D:\stable-diffusion-webui\venv\lib\site-packages\starlette\middleware\gzip.py", line 24, in __call__
    await responder(scope, receive, send)
  File "D:\stable-diffusion-webui\venv\lib\site-packages\starlette\middleware\gzip.py", line 43, in __call__
    await self.app(scope, receive, self.send_with_gzip)
  File "D:\stable-diffusion-webui\venv\lib\site-packages\starlette\middleware\cors.py", line 92, in __call__
    await self.simple_response(scope, receive, send, request_headers=headers)
  File "D:\stable-diffusion-webui\venv\lib\site-packages\starlette\middleware\cors.py", line 147, in simple_response
    await self.app(scope, receive, send)
  File "D:\stable-diffusion-webui\venv\lib\site-packages\starlette\middleware\exceptions.py", line 79, in __call__
    raise exc
  File "D:\stable-diffusion-webui\venv\lib\site-packages\starlette\middleware\exceptions.py", line 68, in __call__
    await self.app(scope, receive, sender)
  File "D:\stable-diffusion-webui\venv\lib\site-packages\fastapi\middleware\asyncexitstack.py", line 21, in __call__
    raise e
  File "D:\stable-diffusion-webui\venv\lib\site-packages\fastapi\middleware\asyncexitstack.py", line 18, in __call__
    await self.app(scope, receive, send)
  File "D:\stable-diffusion-webui\venv\lib\site-packages\starlette\routing.py", line 706, in __call__
    await route.handle(scope, receive, send)
  File "D:\stable-diffusion-webui\venv\lib\site-packages\starlette\routing.py", line 276, in handle
    await self.app(scope, receive, send)
  File "D:\stable-diffusion-webui\venv\lib\site-packages\starlette\routing.py", line 66, in app
    response = await func(request)
  File "D:\stable-diffusion-webui\venv\lib\site-packages\fastapi\routing.py", line 235, in app
    raw_response = await run_endpoint_function(
  File "D:\stable-diffusion-webui\venv\lib\site-packages\fastapi\routing.py", line 163, in run_endpoint_function
    return await run_in_threadpool(dependant.call, **values)
  File "D:\stable-diffusion-webui\venv\lib\site-packages\starlette\concurrency.py", line 41, in run_in_threadpool
    return await anyio.to_thread.run_sync(func, *args)
  File "D:\stable-diffusion-webui\venv\lib\site-packages\anyio\to_thread.py", line 31, in run_sync
    return await get_asynclib().run_sync_in_worker_thread(
  File "D:\stable-diffusion-webui\venv\lib\site-packages\anyio\_backends\_asyncio.py", line 937, in run_sync_in_worker_thread
    return await future
  File "D:\stable-diffusion-webui\venv\lib\site-packages\anyio\_backends\_asyncio.py", line 867, in run
    result = context.run(func, *args)
  File "D:\stable-diffusion-webui\modules\progress.py", line 85, in progressapi
    shared.state.set_current_image()
  File "D:\stable-diffusion-webui\modules\shared.py", line 243, in set_current_image
    self.do_set_current_image()
  File "D:\stable-diffusion-webui\modules\shared.py", line 251, in do_set_current_image
    self.assign_current_image(modules.sd_samplers.samples_to_image_grid(self.current_latent))
  File "D:\stable-diffusion-webui\modules\sd_samplers_common.py", line 51, in samples_to_image_grid
    return images.image_grid([single_sample_to_image(sample, approximation) for sample in samples])
  File "D:\stable-diffusion-webui\modules\sd_samplers_common.py", line 51, in <listcomp>
Thanks for the details. I'll try to dive into this when I get some free time again during the week.
And it's not even that big of a deal; in fact, without restarting the program, it's still working fine after that error. I had to wait several minutes and switch prompts / toggle LoRAs off and on. I'm not sure which step fixed it, or whether just waiting for the VRAM to be freed made it work again.
As the issue seems to have resolved itself on your side, I'll close this for now. I'll reopen and continue to investigate if someone else runs into a similar situation.
I don't really understand what could be causing this, except maybe for a weird interaction between different extensions (including this one). I couldn't reproduce it on my side so far.
I'm leaving a note to my future self here in case I need to investigate further: I think the next step would be to install all of gsgoldma's extensions on https://github.com/AUTOMATIC1111/stable-diffusion-webui/commit/2c1bb46c7ad5b4536f6587d327a03f0ff7811c5d on my side to have any chance of reproducing. Ideally we should reduce the number of extensions to a minimum to make it easier to understand and fix the issue.
Not sure if there's a VRAM leak involved, but I can still do normal generations without the prompt fusion syntax after that error.