lucidrains / deep-daze

Simple command-line tool for text-to-image generation using OpenAI's CLIP and SIREN (implicit neural representation network). The technique was originally created by https://twitter.com/advadnoun

torch won't use my GPU, or so I think. #161

Open · batbeat opened this issue 3 years ago

batbeat commented 3 years ago

I'm not sure why, but I don't think torch is using my GPU (RTX 2080 Super). It gives me this error:

error: :\Users\beaty\AppData\Local\Programs\Python\Python39\lib\site-packages\torch\cuda\amp\grad_scaler.py:115: UserWarning: torch.cuda.amp.GradScaler is enabled, but CUDA is not available. Disabling.
warnings.warn("torch.cuda.amp.GradScaler is enabled, but CUDA is not available. Disabling.")

Any help here?

eeymae commented 3 years ago

I had to reinstall PyTorch with CUDA enabled; pip install deep-daze didn't get me a CUDA-enabled build for some reason.

I first checked whether torch was happy with CUDA by opening a Python interpreter and entering: import torch; torch.cuda.is_available(). This returned False for me, hinting that whatever PyTorch build I had didn't have CUDA support.
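The same check works as a tiny script; a minimal sketch using only the standard torch API (nothing here is deep-daze-specific):

```python
# Quick check: does the installed PyTorch build see a CUDA device?
import torch

print("CUDA available:", torch.cuda.is_available())  # False -> CPU-only build or driver problem
if torch.cuda.is_available():
    # Should name the GPU, e.g. the RTX 2080 Super from the original report
    print("Device:", torch.cuda.get_device_name(0))
```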

I then checked which CUDA version my GPU supported by running nvidia-smi at a command prompt. The version will be in the top right of the displayed results.
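A complementary check, independent of nvidia-smi, is to ask which CUDA toolkit the installed PyTorch wheel was built against; a CPU-only wheel reports None here. Note this is the build's CUDA version, not the driver's maximum supported version that nvidia-smi shows. A short sketch using only standard torch attributes:

```python
# Which CUDA toolkit was this PyTorch build compiled against?
import torch

print("PyTorch version:", torch.__version__)   # CPU-only wheels may carry a "+cpu" suffix
print("Built with CUDA:", torch.version.cuda)  # None for a CPU-only build, e.g. "11.1" for a cu111 wheel
```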

Go to the PyTorch website and select the right options for your installation (pip, conda, etc.) and the CUDA version under Compute Platform. Even though my GPU was using version 11.4, the 11.1 option worked fine. My working options were pip with the CUDA 11.1 compute platform.

Then copy the command that the website gives you into a command prompt and wait. When I ran imagine again, all was good; torch.cuda.is_available() now returns True as well.
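As a hedged sketch of what that final verification can look like after reinstalling: the pip command should be copied from the PyTorch selector itself; the one in the comment below is only an example of the pip + CUDA 11.1 variant from around that time and may not match what the site shows now.

```python
# After reinstalling PyTorch with CUDA support, confirm the GPU is actually usable.
# Example reinstall command from the selector at the time (pip + CUDA 11.1);
# copy the current command from pytorch.org rather than this comment:
#   pip install torch==1.9.0+cu111 torchvision==0.10.0+cu111 torchaudio==0.9.0 \
#       -f https://download.pytorch.org/whl/torch_stable.html
import torch

assert torch.cuda.is_available(), "Still a CPU-only build or a driver problem"
x = torch.randn(1024, 1024, device="cuda")  # allocate a tensor directly on the GPU
y = x @ x                                   # run a small matmul on the GPU
torch.cuda.synchronize()                    # wait for the kernel to finish
print("OK on", torch.cuda.get_device_name(0))
```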

Guo-wenkang commented 1 month ago

Hello, my computer runs Windows, with an RTX 3060 GPU, Python 3.8, and torch (12.4). Over and over again I've downloaded it many times, but I still can't use it. Is there any easy way to do this? I'm a newbie.