SunGreen777 closed this issue 8 months ago
Add --skip-torch-cuda-test to the arguments, as written.
If you get an error after updating, maybe this will help: #334
DirectML is now optional, not a fallback. Please add --use-directml to skip the CUDA test.
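For anyone unsure where that flag goes: in the webui folder it is passed through the COMMANDLINE_ARGS variable in webui-user.bat. A minimal sketch, assuming the stock webui-user.bat template; only the COMMANDLINE_ARGS line is changed:

```shell
@echo off

set PYTHON=
set GIT=
set VENV_DIR=
set COMMANDLINE_ARGS=--use-directml

call webui.bat
```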
--use-directml works, thanks.
This worked for me as well. I had previously been using "--backend directml", which is no longer valid.
When I use --use-directml, I get the error "module 'torch' has no attribute 'dml'".
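The "module 'torch' has no attribute 'dml'" error suggests the DirectML backend package is missing from the venv. A quick way to check, as a sketch (the module name `torch_directml`, corresponding to the pip package `torch-directml`, and the venv path are assumptions):

```python
import importlib.util


def has_torch_directml() -> bool:
    """Return True if the torch_directml module can be found in this environment."""
    return importlib.util.find_spec("torch_directml") is not None


if __name__ == "__main__":
    if has_torch_directml():
        print("torch-directml is available")
    else:
        print("torch-directml is missing; install it into the webui venv")
```

Run it with the webui's own interpreter (e.g. `venv\Scripts\python.exe check.py`), since the package lives in that venv rather than the system Python.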
I confirm that I got exactly the same error. This morning, running "webui.bat" pulled an update to version 1.7 via the "git pull" command, and after a series of errors and a complete reinstallation of Git and Python, which took me the whole day, I realized you had simply updated your SD. I believe the new version is better ^_^, but everything worked very well for me on 1.6, so I don't need the update; unfortunately, I had not removed the "git pull" command from autorun. Could you please provide access to previous versions, or a link to download 1.6? I don't know how to fix this; I only know that it worked before, sorry T____T (If I'm wrong about something, feel free to correct me.)
My PC specs, if needed: CPU: Intel Core i5-6400 (quad-core), RAM: 32 GB, GPU: Radeon RX 5500 XT 8 GB
I would really like to go back to 1.6 if possible, as the current version is nothing but problems. I now have to add --skip-torch-cuda-test --precision full --no-half to get it to work, which I didn't need before, and generating a single image shows an ETA of 1 hour. I've tried a handful of suggestions that didn't work, went as far as uninstalling everything and doing a clean install, and still have problems.
After adding "torch-directml" to "requirements_versions.txt" and reinstalling venv, Stable Diffusion Web UI works correctly with the --use-directml option.
Combine these two changes (adding torch-directml to requirements_versions.txt and passing --use-directml), and it works perfectly fine.
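The full sequence on Windows could look like the following sketch (folder name and the venv-rebuild behavior are assumptions based on the comments above; deleting the venv folder forces webui.bat to recreate it on the next run):

```shell
:: Run from the webui folder (e.g. stable-diffusion-webui-directml)
echo torch-directml>>requirements_versions.txt

:: Remove the old venv so it gets rebuilt with the new requirement
rmdir /s /q venv

:: Relaunch; webui-user.bat should have COMMANDLINE_ARGS=--use-directml set
call webui-user.bat
```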
During the night I found a copy of SD 1.6.0 in the Wayback Machine. If you can't run 1.7, you can try this ^_^ It's a last resort if all else fails.
Like a zip file? How did you do that?
Checklist
What happened?
I have two config files:

1. set COMMANDLINE_ARGS=--medvram --opt-sdp-no-mem-attention --api
2. set COMMANDLINE_ARGS=--medvram --no-half --precision full --always-batch-cond-uncond --opt-sub-quad-attention --sub-quad-q-chunk-size 512 --sub-quad-kv-chunk-size 512 --sub-quad-chunk-threshold 80 --disable-nan-check --upcast-sampling --skip-torch-cuda-test --use-cpu interrogate gfpgan scunet codeformer
With the first config I get an error (see the screenshot); with the second, the video card (RX 570 8 GB) is not used.
Steps to reproduce the problem
skip
What should have happened?
skip
What browsers do you use to access the UI?
No response
Sysinfo
skip
Console logs
Additional information
skip