Matthieu-Tinycoaching opened this issue 2 years ago
Try lowering the batch size; setting `batch_size_gpl=16` might work, though training will take longer. The reason is the amount of GPU RAM consumed by each batch: 32 examples at once may be too big to fit in memory, but 16 might (see the sketch below).
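For reference, a minimal sketch of what the adjusted call might look like, assuming the `gpl.train` entry point from the GPL README. Only `batch_size_gpl`, `gpl_steps`, and `use_amp` come from this thread; the paths and base checkpoint are placeholder assumptions, not the original command:

```python
import gpl

# Minimal sketch, assuming the gpl.train signature from the GPL README.
# Paths and the base checkpoint are placeholders, not the actual command.
gpl.train(
    path_to_generated_data="generated/my-dataset",  # assumed placeholder path
    base_ckpt="distilbert-base-uncased",            # assumed base model
    batch_size_gpl=16,          # halved from 32 to reduce per-batch GPU RAM
    gpl_steps=140_000,
    output_dir="output/my-dataset",                 # assumed placeholder path
    use_amp=True,               # mixed precision also lowers memory use
)
```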
How can I use multiple GPUs with this training script?
Cross-encoders and sentence-transformers do not officially support multi-GPU training, though there are some (very old) forks that have experimented with this feature: https://github.com/UKPLab/sentence-transformers/pull/1215
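In the meantime, a common workaround (my assumption, not something the GPL script documents) is to pin the run to a single GPU so the script never tries to spread across devices:

```python
import os

# Expose only GPU 0 to this process; set this before any CUDA work happens.
# Multi-GPU training is not officially supported, so pick one device.
os.environ["CUDA_VISIBLE_DEVICES"] = "0"

import gpl  # import after setting the variable so torch only sees GPU 0
```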
I'm getting the same error even when setting `batch_size_gpl=1` and `batch_size_generation=1`. I don't get the problem when I set `gpl_steps=1`, however, which seems odd. I'm using passages that average 500 tokens.
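Memory use scales with sequence length as well as batch size, so 500-token passages can exhaust GPU RAM even at `batch_size_gpl=1`. A quick diagnostic sketch; the tokenizer checkpoint and the `passages` list are assumptions for illustration:

```python
from transformers import AutoTokenizer

# Hypothetical check: measure how long the tokenized passages actually are.
# "distilbert-base-uncased" is an assumed checkpoint; replace `passages`
# with your own corpus.
tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
passages = ["..."]  # your corpus passages here
lengths = [len(tokenizer.encode(p, truncation=False)) for p in passages]
print(f"max: {max(lengths)}, mean: {sum(lengths) / len(lengths):.1f}")
```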
Hi,

When trying to generate intermediate results with the following command:

I got the following error:

My corpus consists of small paragraphs of 3-4 lines, and I used the `use_amp` option. How could I deal with this?