Open Tedyyup opened 1 month ago
You have selected 'Shared' for "Swap Location." How much shared GPU memory do you have?
I have a GTX 1650 4 GB and 16 GB of installed RAM.
You have 4 GB of dedicated GPU memory. We are talking about "shared memory." Go to your Task Manager, select GPU, and tell me what you see.
This is what I have. I have an RTX 4060 Ti 16 GB. That's 16 GB of dedicated memory, but I also have 8 GB of shared memory.
NVIDIA GeForce RTX 4060 Ti
Utilization 1%
Dedicated GPU memory 10.6/16.0 GB
Shared GPU memory 0.1/7.9 GB
GPU Memory 10.7/23.9 GB
Check this site out https://www.cgdirector.com/what-is-shared-gpu-memory/
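As the linked article explains, on Windows the "shared GPU memory" pool is normally capped at half of installed system RAM. A minimal sketch of that arithmetic (the 16 GB figure is taken from the system above):

```python
# On Windows, "shared GPU memory" is typically capped at half of
# installed system RAM (default WDDM behavior).
installed_ram_gb = 16  # the system above has 16 GB of RAM
shared_gpu_memory_gb = installed_ram_gb / 2
print(shared_gpu_memory_gb)  # 8.0, matching the ~7.9 GB Task Manager reports
```

This is why a machine with 16 GB of RAM shows roughly 8 GB of shared GPU memory regardless of how much dedicated VRAM the card has.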
Have you tried selecting CPU instead of Shared?
Also, I noticed you reduced the image size to 584. That is way too low for a Flux generation. The picture will most likely be all "noise."
That's wrong. Flux's working resolution ranges from 0.1 MP to 2.0 MP.
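A quick check of the disputed 584-pixel size, assuming a square 584×584 image:

```python
# Flux is commonly run at roughly 0.1-2.0 megapixels.
# Check where a square 584x584 image falls in that range.
width = height = 584
megapixels = width * height / 1_000_000
print(round(megapixels, 2))  # 0.34 MP, inside the 0.1-2.0 MP range
```

So 584×584 sits comfortably within the stated 0.1–2.0 MP window.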
Shared memory for the GPU is 7.9 GB. Whenever I use the Flux NF4 model it works fine, but it does not support NSFW; the Flux model on Civitai supports NSFW.
Have you tried selecting CPU instead of Shared?
Tried, no luck.
Try increasing GPU Weights to the maximum, or enter 10000 (it will adjust automatically), and try again.
I'm trying to use the FLUX dev model from Civitai in my Stable Diffusion Forge WebUI. I'm using it because it can create NSFW images, which the NF4 model cannot. My laptop specs are a GTX 1650 4 GB with 16 GB of installed RAM, and my issue is that this error appears whenever I try to generate:

```
[Memory Management] Target: KModel, Free GPU: 3232.70 MB, Model Require: 11350.07 MB, Previously Loaded: 0.00 MB, Inference Require: 189.00 MB, Remaining: -8306.37 MB
CUDA error: out of memory
CUDA kernel errors might be asynchronously reported at some other API call, so the stacktrace below might be incorrect.
For debugging consider passing CUDA_LAUNCH_BLOCKING=1.
Compile with `TORCH_USE_CUDA_DSA` to enable device-side assertions.
```

Even after using a VAE, it's not working. Can someone please help me with it?
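The numbers in that log are internally consistent: the free VRAM minus what the model and inference pass require gives exactly the negative remainder reported, i.e. the model is roughly 8.3 GB too large for the 4 GB card. A quick check of the arithmetic:

```python
# Values taken directly from the Forge memory-management log above.
free_gpu_mb = 3232.70
model_require_mb = 11350.07
inference_require_mb = 189.00

remaining_mb = free_gpu_mb - model_require_mb - inference_require_mb
print(round(remaining_mb, 2))  # -8306.37, the shortfall reported in the log
```

A negative remainder here means the full-precision FLUX dev checkpoint simply cannot fit in 4 GB of VRAM, which is why the NF4 (quantized) model works while this one runs out of memory.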