tomaarsen closed this 1 day ago
Can you confirm it fixes your issue?
It does indeed. I think I can write a test to go alongside this PR - it doesn't need to be ST-specific.
This test roughly mirrors my use case and should act as an effective check that ST works. Feel free to update it if, e.g., you don't normally use Accelerator in the data_loader tests.
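For reference, a rough sketch of what such a test could look like, assuming a single process and that `Accelerator.prepare` keeps the original batch sampler reachable; `EpochRecordingBatchSampler` and the test name are invented for this sketch, not the test as merged:

```python
from torch.utils.data import BatchSampler, DataLoader, SequentialSampler

from accelerate import Accelerator


class EpochRecordingBatchSampler(BatchSampler):
    """Records every epoch passed to set_epoch (name invented for this sketch)."""

    def __init__(self, sampler, batch_size, drop_last):
        super().__init__(sampler, batch_size, drop_last)
        self.seen_epochs = []

    def set_epoch(self, epoch):
        self.seen_epochs.append(epoch)


def test_set_epoch_reaches_batch_sampler():
    accelerator = Accelerator()
    dataset = list(range(16))
    batch_sampler = EpochRecordingBatchSampler(
        SequentialSampler(dataset), batch_size=4, drop_last=False
    )
    dataloader = accelerator.prepare(DataLoader(dataset, batch_sampler=batch_sampler))

    for epoch in range(3):
        dataloader.set_epoch(epoch)
        for _ in dataloader:
            pass

    # With the fix, set_epoch propagates to the batch sampler itself.
    assert batch_sampler.seen_epochs == [0, 1, 2]
```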
What does this PR do?
When training with the `transformers` Trainer or related libraries (Sentence Transformers, SpanMarker, SetFit, etc.), `set_epoch` is called on the dataloader. The call is propagated down to `dataloader.batch_sampler.sampler` if that sampler has a `set_epoch` method, but not to `dataloader.batch_sampler` itself. This prevents epoch-specific generator seeding in custom batch samplers, such as the ones that are common in Sentence Transformers: https://sbert.net/docs/package_reference/sentence_transformer/sampler.html
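To illustrate, here is a minimal sketch of the kind of batch sampler this affects, followed by roughly what the propagation fix amounts to. The `SeededBatchSampler` name and seeding scheme are invented for illustration (the real Sentence Transformers samplers differ in detail), and `propagate_set_epoch` is a hypothetical helper, not the exact diff:

```python
import torch
from torch.utils.data import BatchSampler


class SeededBatchSampler(BatchSampler):
    """Reshuffles the batch order every epoch with an epoch-seeded generator."""

    def __init__(self, sampler, batch_size, drop_last, seed=0):
        super().__init__(sampler, batch_size, drop_last)
        self.seed = seed
        self.epoch = 0

    def set_epoch(self, epoch):
        # Without this PR, nothing calls this during training, so every
        # epoch replays the identical batch order.
        self.epoch = epoch

    def __iter__(self):
        generator = torch.Generator()
        generator.manual_seed(self.seed + self.epoch)
        batches = list(super().__iter__())
        for i in torch.randperm(len(batches), generator=generator).tolist():
            yield batches[i]


def propagate_set_epoch(dataloader, epoch):
    """Sketch of the fix: notify both the inner sampler and the batch sampler."""
    sampler = getattr(dataloader.batch_sampler, "sampler", None)
    if hasattr(sampler, "set_epoch"):
        # Existing behavior: the inner sampler already received the epoch.
        sampler.set_epoch(epoch)
    if hasattr(dataloader.batch_sampler, "set_epoch"):
        # New behavior with this PR: the batch sampler itself is notified too.
        dataloader.batch_sampler.set_epoch(epoch)
```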
See also https://github.com/UKPLab/sentence-transformers/issues/3069 by @antigregory, which showed that `set_epoch` on my batch samplers is not called as I expected.

Before submitting
Who can review?
@muellerzr (p.s. get better soon!)