invoke-ai / InvokeAI

Invoke is a leading creative engine for Stable Diffusion models, empowering professionals, artists, and enthusiasts to generate and create visual media using the latest AI-driven technologies. The solution offers an industry leading WebUI, and serves as the foundation for multiple commercial products.
https://invoke-ai.github.io/InvokeAI/
Apache License 2.0

[bug]: problems with SD after installing CUDA? #5305

Closed atimogus closed 11 months ago

atimogus commented 11 months ago

Is there an existing issue for this?

OS

Windows

GPU

cuda

VRAM

12

What version did you experience this issue on?

SD 1.5

What happened?

I had these arguments before and SD was working fine:

set COMMANDLINE_ARGS=--medvram --xformers --force-enable-xformers --always-batch-cond-uncond --opt-channelslast --no-hashing --disable-nan-check --api --xformers-flash-attention --opt-split-attention --no-half-vae
set PYTORCH_CUDA_ALLOC_CONF=garbage_collection_threshold:0.6, max_split_size_mb:32

I was downloading different versions of CUDA because I needed it for a TensorFlow project (I realised that TF doesn't work on Windows anymore), so I started running TF on WSL, and now I have the newest CUDA on my PC. After 2 months of not using SD, I opened it again and tried to generate a few pictures, and I was getting this error (with the args I mentioned above):
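After a system CUDA reinstall, one quick sanity check is whether the PyTorch inside the venv still sees a working CUDA runtime. Note that PyTorch bundles its own CUDA libraries, so `torch.version.cuda` reports the version PyTorch was built against, not the system-wide toolkit. A minimal check, using only standard PyTorch APIs:

```python
import torch

# torch.version.cuda is the CUDA version PyTorch was built with,
# which is independent of the system-wide CUDA toolkit install.
print("torch:", torch.__version__)
print("built with CUDA:", torch.version.cuda)
print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    # Name of the first visible GPU, e.g. an RTX 3060.
    print("device:", torch.cuda.get_device_name(0))
```

If `torch.cuda.is_available()` is False after the reinstall, the venv likely needs a PyTorch build matching a supported CUDA runtime.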

RuntimeError: mat1 and mat2 must have the same dtype
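That error comes from a matrix multiply whose two operands have different floating-point dtypes, typically a half-precision model tensor meeting a float32 one. It can be reproduced in isolation (assuming PyTorch is installed; the exact message wording varies by version and device):

```python
import torch

a = torch.ones(2, 2, dtype=torch.float16)  # half precision
b = torch.ones(2, 2, dtype=torch.float32)  # single precision

try:
    torch.mm(a, b)  # mixed-dtype matmul is rejected by PyTorch
except RuntimeError as e:
    print(e)  # dtype-mismatch error, e.g. "mat1 and mat2 must have the same dtype"
```

Flags like --no-half work around this by keeping the whole pipeline in float32.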

So I tried removing all arguments; then I can see in the preview that the image is generating, but I get this output:

NansException: A tensor with all NaNs was produced in Unet. This could be either because there's not enough precision to represent the picture, or because your video card does not support half type. Try setting the "Upcast cross attention layer to float32" option in Settings > Stable Diffusion or using the --no-half commandline argument to fix this. Use --disable-nan-check commandline argument to disable this check.
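The NaNs this message describes typically come from half-precision overflow: float16 tops out around 65504, so an intermediate value that exceeds it becomes inf, and later arithmetic on inf produces NaN. A toy illustration, assuming PyTorch:

```python
import torch

x = torch.tensor(70000.0, dtype=torch.float16)  # above float16 max (~65504)
print(x)       # inf: the value overflows half precision
print(x - x)   # nan: inf - inf is undefined
```

This is why the error message suggests --no-half or upcasting the cross-attention layer to float32.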

Then I added --no-half.

Screenshot 2023-12-17 003601

After this picture I tried other sampling methods, still the same (but with Euler sampling I get: NansException: A tensor with all NaNs was produced in Unet. Use --disable-nan-check commandline argument to disable this check.)

Now I'm trying these arguments: set COMMANDLINE_ARGS= --no-half-vae --no-half --disable-nan-check

Screenshot 2023-12-17 001557

I tried changing models, which did not fix the problem, and I enabled the "Upcast cross attention layer to float32" option in Settings; I still get the same generated pictures.

GPU: RTX 3060 12GB VRAM. Every time I changed arguments I deleted the venv. I also tried installing new NVIDIA drivers, but now I think --xformers doesn't work, though that is not the main problem.

Screenshots

No response

Additional context

No response

Contact Details

No response

Millu commented 11 months ago

This looks like an issue with A1111 - you'll have more luck creating an issue there (this is the InvokeAI repository).