continue-revolution / sd-webui-animatediff

AnimateDiff for AUTOMATIC1111 Stable Diffusion WebUI
Other
3.11k stars 258 forks

[Bug]: #226

Closed. bihailantian655 closed this issue 1 year ago.

bihailantian655 commented 1 year ago

Is there an existing issue for this?

Have you read FAQ on README?

What happened?

Send to miniPaint Error: RuntimeError: CUDA error: invalid configuration argument. CUDA kernel errors might be asynchronously reported at some other API call, so the stacktrace below might be incorrect. For debugging consider passing CUDA_LAUNCH_BLOCKING=1. Compile with TORCH_USE_CUDA_DSA to enable device-side assertions.

Steps to reproduce the problem

default settings

What should have happened?

default settings

Commit where the problem happens

webui: 1.6

What browsers do you use to access the UI?

No response

Command Line Arguments

Send to miniPaint
Error
RuntimeError: CUDA error: invalid configuration argument CUDA kernel errors might be asynchronously reported at some other API call, so the stacktrace below might be incorrect. For debugging consider passing CUDA_LAUNCH_BLOCKING=1. Compile with `TORCH_USE_CUDA_DSA` to enable device-side assertions.

Console logs

Traceback (most recent call last):
  File "K:\sd-webui-aki-v4.4\python\lib\site-packages\gradio\routes.py", line 488, in run_predict
    output = await app.get_blocks().process_api(
  File "K:\sd-webui-aki-v4.4\python\lib\site-packages\gradio\blocks.py", line 1431, in process_api
    result = await self.call_function(
  File "K:\sd-webui-aki-v4.4\python\lib\site-packages\gradio\blocks.py", line 1103, in call_function
    prediction = await anyio.to_thread.run_sync(
  File "K:\sd-webui-aki-v4.4\python\lib\site-packages\anyio\to_thread.py", line 31, in run_sync
    return await get_asynclib().run_sync_in_worker_thread(
  File "K:\sd-webui-aki-v4.4\python\lib\site-packages\anyio\_backends\_asyncio.py", line 937, in run_sync_in_worker_thread
    return await future
  File "K:\sd-webui-aki-v4.4\python\lib\site-packages\anyio\_backends\_asyncio.py", line 867, in run
    result = context.run(func, *args)
  File "K:\sd-webui-aki-v4.4\python\lib\site-packages\gradio\utils.py", line 707, in wrapper
    response = f(*args, **kwargs)
  File "K:\sd-webui-aki-v4.4\modules\call_queue.py", line 94, in f
    mem_stats = {k: -(v//-(1024*1024)) for k, v in shared.mem_mon.stop().items()}
  File "K:\sd-webui-aki-v4.4\modules\memmon.py", line 92, in stop
    return self.read()
  File "K:\sd-webui-aki-v4.4\modules\memmon.py", line 77, in read
    free, total = self.cuda_mem_get_info()
  File "K:\sd-webui-aki-v4.4\modules\memmon.py", line 34, in cuda_mem_get_info
    return torch.cuda.mem_get_info(index)
  File "K:\sd-webui-aki-v4.4\python\lib\site-packages\torch\cuda\memory.py", line 618, in mem_get_info
    return torch.cuda.cudart().cudaMemGetInfo(device)
RuntimeError: CUDA error: device-side assert triggered
Hint: the Python runtime raised an exception. Please check the troubleshooting page.
CUDA kernel errors might be asynchronously reported at some other API call, so the stacktrace below might be incorrect.
For debugging consider passing CUDA_LAUNCH_BLOCKING=1.
Compile with `TORCH_USE_CUDA_DSA` to enable device-side assertions.
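As an aside, the `mem_stats` line in the traceback, `-(v//-(1024*1024))`, is a ceiling division converting bytes to MiB; the failure itself happens later, in `torch.cuda.mem_get_info`. A minimal sketch of that idiom (the function name here is mine, not from the WebUI code):

```python
def bytes_to_mib_ceil(v: int) -> int:
    # -(v // -(1024 * 1024)) is integer ceiling division:
    # equivalent to math.ceil(v / 2**20) without floating point.
    return -(v // -(1024 * 1024))

print(bytes_to_mib_ceil(1024 * 1024 + 1))  # 1 MiB + 1 byte rounds up to 2
```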

Additional information

bb

bihailantian655 commented 1 year ago

The total number of frames was set to 0.

continue-revolution commented 1 year ago

~#204~

This doesn't seem to be caused by xformers, but I have no idea why it happens.

continue-revolution commented 1 year ago

Please search for TORCH_USE_CUDA_DSA and see if one of those solutions work for you, especially https://github.com/continue-revolution/sd-webui-animatediff/issues/101#issuecomment-1731739213
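For reference, the error message's own debugging suggestion can be applied by setting `CUDA_LAUNCH_BLOCKING=1` before launching the WebUI, so CUDA errors surface at the failing call instead of a later API call. A sketch; the exact launch command depends on your install:

```shell
# Windows cmd (this user's setup); use `export CUDA_LAUNCH_BLOCKING=1` on Linux/macOS.
# Synchronous kernel launches make the stack trace point at the real failing call.
set CUDA_LAUNCH_BLOCKING=1
python launch.py
```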

bihailantian655 commented 1 year ago

Are you Chinese, by the way?

bihailantian655 commented 1 year ago

Setting cross attention to sdp in the 秋叶 launcher works.

continue-revolution commented 1 year ago

Then it is an xformers problem. It seems that basically all of these CUDA errors come from xformers.
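For users without the 秋叶 launcher, the equivalent fix should be to drop `--xformers` from the WebUI launch flags and enable PyTorch's SDP cross-attention instead (flag names as documented for AUTOMATIC1111 WebUI; a sketch of `webui-user.bat`):

```shell
REM webui-user.bat: replace --xformers with SDP attention to avoid the CUDA error
set COMMANDLINE_ARGS=--opt-sdp-attention
call webui.bat
```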