XLabs-AI / x-flux-comfyui


- **Exception Message:** cuDNN Frontend error: CUDNN_BACKEND_OPERATION: cudnnFinalize Failed cudnn_status: CUDNN_STATUS_BAD_PARAM #145

Open AS1ING opened 4 weeks ago

AS1ING commented 4 weeks ago

XLabs-AI/flux-ip-adapter-v2

# ComfyUI Error Report

## Error Details


## System Information
- **ComfyUI Version:** v0.2.4-12-g669d9e4
- **Arguments:** F:\StabilityMatrix-win-x64\Data\Packages\ComfyUI Test\main.py --preview-method auto --auto-launch --fast --reserve-vram 0.8
- **OS:** nt
- **Python Version:** 3.10.11 (tags/v3.10.11:7d4cc5a, Apr  5 2023, 00:38:17) [MSC v.1929 64 bit (AMD64)]
- **Embedded Python:** false
- **PyTorch Version:** 2.5.0+cu124
## Devices

- **Name:** cuda:0 NVIDIA GeForce RTX 4090 : cudaMallocAsync
  - **Type:** cuda
  - **VRAM Total:** 25756696576
  - **VRAM Free:** 2406419424
  - **Torch VRAM Total:** 20434649088
  - **Torch VRAM Free:** 34708448
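
For anyone posting the same report, the version and VRAM numbers above can be reproduced with a short PyTorch snippet (a minimal sketch using standard torch APIs, not part of the original report):

```python
import torch

# Versions that matter for this cuDNN error
print("PyTorch:", torch.__version__)
print("CUDA (build):", torch.version.cuda)
print("cuDNN:", torch.backends.cudnn.version())

# Per-device name and free/total memory in bytes, matching the report above
for i in range(torch.cuda.device_count()):
    free, total = torch.cuda.mem_get_info(i)
    print(f"cuda:{i}", torch.cuda.get_device_name(i), f"free={free} total={total}")
```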
caslix commented 4 weeks ago

Hello! I have the same problem.

rgfx commented 3 weeks ago

This error occurs with the new workflow that was provided. If I use the old workflow and just switch the model to the new one, it works.

christming commented 3 weeks ago

> This error occurs with the new workflow that was provided. If I use the old workflow and just switch the model to the new one, it works.

Hello~ how did you fix this bug? Can you give some more details here? I hit the same error when using the new XLabs-AI/flux-ip-adapter-v2.

xlibfly commented 3 weeks ago

I have the same problem.

lior007 commented 3 weeks ago

same

Orenji-Tangerine commented 3 weeks ago

I have the same problem.

xieyao2 commented 3 weeks ago

Same problem. rgfx, how did you fix it?

xieyao2 commented 3 weeks ago

XLabs-AI/flux-ip-adapter-v2, 14700KF, 4060 Ti 16GB, 64GB RAM, torch 2.5.0 + CUDA 12.4 + xformers 0.0.28.post2

xieyao2 commented 3 weeks ago

When I use only one image, the workflow runs fine.
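
The reports above suggest the failure only appears with certain input combinations. One generic way to check whether cuDNN itself is the component rejecting the tensor shapes (a diagnostic sketch under that assumption, not a fix confirmed anywhere in this thread) is to retry the failing run with cuDNN disabled in PyTorch, for example by adding this near the top of ComfyUI's `main.py`:

```python
import torch

# If the workflow succeeds with cuDNN disabled, CUDNN_STATUS_BAD_PARAM is
# coming from a cuDNN kernel rejecting the shapes/dtypes; expect the run
# to be noticeably slower in this mode.
torch.backends.cudnn.enabled = False

# Setting CUDA_LAUNCH_BLOCKING=1 in the environment can also help surface
# the exact failing call instead of a later asynchronous error.
```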