utterworks / fast-bert

Super easy library for BERT based NLP models
Apache License 2.0

How to use cuDNN device with fast-bert #315

Closed elinork closed 1 year ago

elinork commented 1 year ago

Hello, I'm trying to train a model on an Ubuntu image with cuDNN support, but I can't seem to select the device properly. torch.cuda.is_available() is False but torch.backends.cudnn.enabled is True. I've tried starting over with various versions of PyTorch, but I always get the same behavior. I can't specify a device "cudnn", though. Any ideas?
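(For context, the two flags answer different questions: `cudnn.enabled` only tells PyTorch to *use* cuDNN if CUDA works, while `cuda.is_available()` is the actual runtime check. A minimal sketch of how one might probe this, assuming a standard PyTorch install:)

```python
import torch

# torch.backends.cudnn.enabled is just a preference flag -- it can be True
# even when no working GPU/driver/CUDA runtime is present.
print(torch.backends.cudnn.enabled)

# This is the real check for a usable GPU at runtime.
print(torch.cuda.is_available())

# Valid device strings are "cuda" / "cuda:0" / "cpu" -- "cudnn" is not a
# device type, so fall back to CPU when CUDA isn't available.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
print(device)
```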

lingdoc commented 1 year ago

What is the output of nvidia-smi? If there is no output, you need to install the NVIDIA driver appropriate to your system and whatever version of CUDA you're running. Check https://docs.nvidia.com/deeplearning/cudnn/support-matrix/index.html for more info on the appropriate toolkit version etc. Keep in mind that enabling the backend doesn't actually mean that CUDA is available, and the error message suggests it may not be installed properly (if at all).
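(Besides nvidia-smi, it can help to check from Python which CUDA version your PyTorch wheel was built against; a CPU-only wheel will never see the GPU no matter what driver is installed. A quick diagnostic sketch, assuming a standard PyTorch install:)

```python
import torch

# The installed wheel's version string; a "+cpu" suffix means a CPU-only
# build that cannot use CUDA at all.
print(torch.__version__)

# The CUDA version PyTorch was compiled against (None for CPU-only builds).
# This must be compatible with the driver that nvidia-smi reports.
print(torch.version.cuda)

# Runtime visibility: is a GPU actually usable, and how many?
print(torch.cuda.is_available())
print(torch.cuda.device_count())
```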

It can also be tricky to get the right combination working with your particular PyTorch version, but there are some good blog posts out there on this. A search based on your particular setup should point you in the right direction.

elinork commented 1 year ago

Thanks - this issue is almost certainly something unusual about my environment rather than this package, so I'll go ahead and close it out. I'll research whether I've got the correct PyTorch version installed.