Open wolfgangmeyers opened 2 years ago
Yeah, looks like it's running on the cpu. Might dig further to see if I can find out why.
Seems related to https://github.com/google/jax/issues/5231
Finally managed to get it to work. Needed to do the following:
Install cuDNN: https://developer.nvidia.com/rdp/cudnn-download
This threw me for a bit: the installer drops a bunch of .deb files in /var/cudnn-local-repo-ubuntu2004-8.4.1.50, and I needed to install a .deb from that folder as well.
Then I followed installation instructions for jax here: https://github.com/google/jax#installation
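After those two steps, a quick way to confirm that jax is actually targeting the GPU (a hedged sketch; exact device names and backend strings vary by jax version) is:

```python
# Sanity check after installing cuDNN and the CUDA-enabled jax wheel.
# If the install worked, the default backend should report "gpu" (or
# "cuda" on newer jax versions) instead of silently falling back to "cpu".
import jax

print(jax.default_backend())  # expect "gpu"/"cuda" on a working install
print(jax.devices())          # the devices jax will dispatch work to
```

If this still prints "cpu", jax is running on the CPU regardless of what pytorch reports.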
Did you get any GPU usage when running? I had the same issue and was able to fix it (the error went away, and both jax and pytorch identified my GPU and said it was available), but when running the model there was no GPU usage: CPU usage maxed out on a single core and it took as long as before.
Well, I think I can conclude that my GPU usage readout (at least in Task Manager) is a bit scuffed, since playing RDR2 doesn't show any considerable increase in usage either.
Getting the following output when trying to run on WSL. I've followed all of the steps that should make it work, and pytorch seems to think there is GPU support. Not sure if this is a bogus error or not.
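For context, this is roughly the kind of check being described (a sketch, not the actual output from that run): pytorch reporting the GPU as available while jax may still fall back to CPU.

```python
# Rough availability check for both frameworks under WSL. "pytorch seems
# to think there is GPU support" corresponds to the first line printing
# True; a jax CPU fallback corresponds to the second line printing "cpu".
import torch
import jax

print(torch.cuda.is_available())  # True when pytorch can reach the GPU
print(jax.default_backend())      # "cpu" here means jax fell back to CPU
```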