ThorinWolf opened 2 years ago
Instead of renaming your local files, try changing this line: https://github.com/ahrm/UnstableFusion/blob/91d40dd7a085c8937ef7b7d1478c54ca50a7850d/diffusionserver.py#L34
to this:
"CompVis/stable-diffusion-v1-4",
and see if it helps.
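For reference, the change amounts to passing the v1.4 repo id to diffusers instead of pointing at a renamed local folder. A minimal sketch, assuming the server loads the checkpoint with `StableDiffusionPipeline.from_pretrained` or similar (the helper name and defaults here are illustrative, not the actual code in `diffusionserver.py`):

```python
def load_pipeline(model_id: str = "CompVis/stable-diffusion-v1-4", device: str = "cuda"):
    """Build a Stable Diffusion pipeline from a Hugging Face repo id.

    Imports are deferred so the function can be defined without torch/diffusers
    installed; actually calling it needs both, plus a CUDA GPU for `device="cuda"`.
    """
    import torch
    from diffusers import StableDiffusionPipeline

    pipe = StableDiffusionPipeline.from_pretrained(
        model_id,
        torch_dtype=torch.float16,  # half precision roughly halves VRAM use
    )
    return pipe.to(device)
```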
Done, but no effect: the exact same error. And since Torch reserves memory but doesn't release it properly after each crash, I get out-of-memory errors if I try a second generation without restarting.
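For what it's worth, the leaked reservations can usually be released without restarting the process. A sketch, assuming the failed pipeline object is no longer referenced anywhere:

```python
def release_cuda_memory():
    """Drop unreachable Python objects, then ask PyTorch's caching allocator
    to hand its unused cached blocks back to the CUDA driver."""
    import gc
    gc.collect()
    try:
        import torch
        if torch.cuda.is_available():
            torch.cuda.empty_cache()
    except ImportError:
        pass  # torch not installed; nothing to release
```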
I assume it is a memory error? Maybe try enabling attention slicing and see if it helps: https://github.com/ahrm/UnstableFusion/pull/34/commits/65f95a6baa1d90061ed9cd16cf58e3994c52634b
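Attention slicing makes the pipeline compute attention in sequential chunks instead of one large matmul, lowering peak VRAM at a small speed cost. A minimal sketch of what the linked commit enables (the helper name is illustrative; `pipe` stands for the loaded diffusers pipeline):

```python
def enable_low_vram_mode(pipe):
    """Turn on diffusers' attention slicing on an existing pipeline:
    attention is computed slice by slice, reducing peak VRAM usage."""
    pipe.enable_attention_slicing()
    return pipe
```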
No change, though it reached the start of generation faster. I monitored CPU, system memory, GPU, and GPU memory via htop and nvtop; none of them topped out, so I presume it is not a memory issue.
What is your CUDA version?
NVCC:

```
$ nvcc --version
nvcc: NVIDIA (R) Cuda compiler driver
Copyright (c) 2005-2022 NVIDIA Corporation
Built on Wed_Sep_21_10:33:58_PDT_2022
Cuda compilation tools, release 11.8, V11.8.89
Build cuda_11.8.r11.8/compiler.31833905_0
```
PyTorch doesn't support CUDA 11.8; the maximum supported version is 11.6.
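Note that pip/conda PyTorch wheels bundle their own CUDA runtime, so the version that matters is the one the wheel was built against rather than the system `nvcc`. A quick way to check it (hypothetical helper name):

```python
def torch_cuda_version():
    """Return the CUDA version this PyTorch build was compiled against
    (e.g. "11.6"), or None for CPU-only builds or when torch is missing."""
    try:
        import torch
        return torch.version.cuda
    except ImportError:
        return None
```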
Downgraded to:

```
$ nvcc --version
nvcc: NVIDIA (R) Cuda compiler driver
Copyright (c) 2005-2021 NVIDIA Corporation
Built on Fri_Dec_17_18:16:03_PST_2021
Cuda compilation tools, release 11.6, V11.6.55
Build cuda_11.6.r11.6/compiler.30794723_0
```
Same error occurs. I can try downgrading further if required, but it is now at 11.6.
OS: Arch Linux (rolling)
GPU: GTX 1660 SUPER
Driver: nvidia-520.56.06
CUDA: cuda-tools installed
Whenever I try to generate, it crashes with a CUDA error, as shown below. I have installed all the listed dependencies, plus a fair few others that it pointed out were missing. I'm using a local clone of v1.4 of the diffusion model renamed to v1.5, because the program only accepts a v1.5 folder even though v1.5 doesn't appear to be public yet, and the HTTPX request fails when I try to use my access key.
See the terminal output below (ignore the backslashes in the last line; they are only there because the text interfered with the code block):