Closed: SnekCode closed this issue 1 year ago
Yes, however, if you don't want to run the server locally, you don't need the CUDA dependencies, that's why I didn't include it in the requirements.
Ahh ok, that makes sense. For complete SD noobs like myself, would it help to add that distinction to the README? And for documentation purposes, should folks who want to run locally use pip install torch==1.12.1+cu113 -f https://download.pytorch.org/whl/torch_stable.html ?
Added some documentation in the README.
Ahh you beat me to it! Thanks!
Hello,
Installed all dependencies using
pip install -r requirements.txt
but receive this error: AssertionError: Torch not compiled with CUDA enabled
I assume line 8
torch>=1.12.1
in the requirements.txt is incorrect and should be something like torch>=1.12.1=py38_cu113
. Haven't tested this yet, nor am I confident those are the correct versions for a CUDA-compiled build of torch. Just posting in case you have a quicker answer!
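For anyone else hitting this: a quick way to tell whether the torch you installed is a CUDA build is to look at its version string. This is a hedged sketch, not from the repo: CUDA wheels from the PyTorch index carry a local version tag like +cu113, while the default CPU-only wheel from PyPI does not. The helper name is_cuda_build below is made up for illustration.

```python
def is_cuda_build(version: str) -> bool:
    """Return True if a torch version string indicates a CUDA wheel (e.g. '1.12.1+cu113')."""
    return "+cu" in version

try:
    import torch
    # On a real install, also check that CUDA is actually usable at runtime.
    print(torch.__version__, "CUDA build:", is_cuda_build(torch.__version__))
    print("CUDA available:", torch.cuda.is_available())
except ImportError:
    # torch not installed here; demonstrate on sample version strings instead.
    print(is_cuda_build("1.12.1+cu113"))  # CUDA wheel
    print(is_cuda_build("1.12.1"))        # CPU-only wheel
```

If is_cuda_build is False (or torch.cuda.is_available() returns False despite a CUDA wheel), you'll get the "Torch not compiled with CUDA enabled" assertion when anything tries to move tensors to the GPU.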