UKPLab / sentence-transformers

State-of-the-Art Text Embeddings
https://www.sbert.net
Apache License 2.0

`CrossEncoder` is not pushed to cuda until predict is called, even if cuda is specified as device. #3078

Open susnato opened 1 week ago

susnato commented 1 week ago

Hi, this is more of a question than a bug report.

When I specify a target device while initializing a CrossEncoder, the model is not pushed to that device until the predict or fit method is called; until then, the model stays on the CPU.

from sentence_transformers import CrossEncoder

model2 = CrossEncoder("mixedbread-ai/mxbai-rerank-large-v1", device="cuda:0")
print(model2.model.device)
# cpu

I would expect the model to be pushed to the specified device during initialization; until I call predict, it takes up my system RAM instead. Is there a high-level reason why this is the case?
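As a workaround until this changes, the underlying module can be moved manually right after construction (e.g. `model2.model.to("cuda:0")`). A minimal sketch of that pattern, using a tiny `nn.Linear` as a stand-in for the wrapped Hugging Face model so it runs on CPU-only machines too:

```python
import torch
import torch.nn as nn

# Pick whatever device is actually available so the sketch runs anywhere.
device = "cuda:0" if torch.cuda.is_available() else "cpu"

model = nn.Linear(4, 1)      # stand-in for CrossEncoder(...).model
print(model.weight.device)   # parameters start on the CPU

model.to(device)             # the explicit move that CrossEncoder currently defers
print(model.weight.device)   # now on the requested device
```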

pesuchin commented 2 days ago

I'm not sure of the intention behind this implementation, but I think it's because the following line inside the fit function is where the model is first transferred to the GPU.

https://github.com/UKPLab/sentence-transformers/blob/df6a8e8278b49e7ca01401f46799610106a7b640/sentence_transformers/cross_encoder/CrossEncoder.py#L235
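The lazy placement described above can be sketched roughly like this (illustrative class and attribute names only, not the actual CrossEncoder source; a tiny `nn.Linear` stands in for the Hugging Face model):

```python
import torch
import torch.nn as nn

class LazyCrossEncoder:
    """Hypothetical sketch: the target device is resolved at init,
    but the module is only moved to it once predict() or fit() runs."""

    def __init__(self, device=None):
        self.model = nn.Linear(4, 1)  # stand-in for the HF model, stays on CPU
        if device is None:
            device = "cuda" if torch.cuda.is_available() else "cpu"
        self._target_device = torch.device(device)

    def predict(self, features):
        self.model.to(self._target_device)  # first transfer happens here
        with torch.no_grad():
            return self.model(features.to(self._target_device))

enc = LazyCrossEncoder(device="cpu")
print(enc.model.weight.device)   # still on the CPU: nothing moved at init
enc.predict(torch.randn(2, 4))   # the move only happens inside predict
```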

susnato commented 2 days ago

Yes, but until I call fit or predict the model stays on the CPU, which is inconvenient IMO and also takes up RAM.

tomaarsen commented 2 days ago

Hello!

Apologies for the delay. This was a design decision made by my predecessor. It was also the case for Sentence Transformer models, but it has since been updated there (see #2351), as I believe it's better to move the model to the desired device immediately.

I'll fix this when I start updating cross-encoders soon, although I'm also open to a PR much like #2351 in the meantime.
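The eager alternative, in the spirit of what #2351 did for SentenceTransformer, would move the model inside `__init__` instead. A hedged sketch under the same illustrative names as above (not the planned implementation):

```python
import torch
import torch.nn as nn

class EagerCrossEncoder:
    """Hypothetical sketch: the model is placed on the target device
    immediately during construction, so no RAM is held until predict()."""

    def __init__(self, device=None):
        if device is None:
            device = "cuda" if torch.cuda.is_available() else "cpu"
        self.device = torch.device(device)
        self.model = nn.Linear(4, 1).to(self.device)  # moved at init time

enc = EagerCrossEncoder(device="cpu")
print(enc.model.weight.device)  # already on the requested device
```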

susnato commented 2 days ago

Hello @tomaarsen, thanks for the response!

I would like to create the PR to fix this; could you please assign it to me?

tomaarsen commented 2 days ago

Gladly!