Closed: NeverInAsh closed this issue 6 months ago
If a GPU is available, it is used automatically. Otherwise, you can pass a device parameter when loading the model, like this:
cross_encoder = CrossEncoder(model_path, device='cuda:0')
Indeed, that should work. I'll close this!
Thanks for developing this amazing library. I trained an MS MARCO passage re-ranking model using Sentence Transformers, and now I want to load it with CrossEncoder.
However, I was not able to set it up for GPU inference.
import torch
from sentence_transformers import CrossEncoder

covid_marco = CrossEncoder("./pre_trained_model/training_medmarco_covidbert")

# Check if a GPU is available and use it
if torch.cuda.is_available():
    covid_marco = covid_marco.to(torch.device("cuda"))
    print(covid_marco.device)
This throws an error:
AttributeError Traceback (most recent call last)
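The device-selection pattern suggested above can be sketched as follows. Since the trained model path is specific to the user's machine, this sketch uses a plain torch module as a stand-in; with sentence-transformers you would instead pass the computed device string as the `device` argument to `CrossEncoder(...)` rather than calling `.to()` on the wrapper object.

```python
import torch

# Choose the device the same way CrossEncoder's `device` argument would be set:
# first GPU if available, otherwise CPU.
device = "cuda:0" if torch.cuda.is_available() else "cpu"

# Stand-in module to illustrate moving a model to the chosen device.
# With sentence-transformers: CrossEncoder(model_path, device=device)
model = torch.nn.Linear(4, 1).to(device)

# The module's parameters now live on the selected device.
print(next(model.parameters()).device)
```

The key point is that the device is passed at load time, so there is no need to move the wrapper object afterwards.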