sosuperic / MeanSum


Running MeanSum without CUDA #19

Open EricvanSchaik opened 2 years ago

EricvanSchaik commented 2 years ago

Hi, I am trying to run MeanSum on a cluster where CUDA is not available. When loading the language model, deserialization fails with `AttributeError: module 'torch._C' has no attribute '_cuda_getDevice'`, which I guess is expected since CUDA is not enabled. Is it possible to run MeanSum with just CPUs?
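
For reference, this kind of error typically comes from `torch.load` trying to restore tensors that were saved on a CUDA device. A minimal sketch of remapping such a checkpoint onto the CPU with `map_location` (the checkpoint path and model class below are hypothetical, not MeanSum's actual loader):

```python
import torch

# Hypothetical checkpoint path; MeanSum's actual checkpoint layout may differ.
CKPT_PATH = "checkpoints/lm/lm.pt"

# map_location='cpu' remaps tensors saved on a CUDA device onto the CPU,
# so deserialization does not require CUDA to be available.
state = torch.load(CKPT_PATH, map_location="cpu")

# If the checkpoint stores a state_dict, it can then be loaded into a CPU model:
# model = SomeLanguageModel(...)  # hypothetical model class
# model.load_state_dict(state.get("state_dict", state))
# model.to("cpu").eval()
```

Even with the load fixed, any hard-coded `.cuda()` calls elsewhere in the code would presumably also need to be guarded with `torch.cuda.is_available()` for a CPU-only run.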