Description:

I don't believe the `max_length` parameter of `TrainingArguments` is actually being used.

Minimal working example (borrowed from the quickstart):
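(The original snippet did not survive formatting; below is a sketch in the shape of the SetFit quickstart, with `max_length=5` so truncation is easy to spot. The dataset and model choices are illustrative, not necessarily the ones in the original report.)

```python
from datasets import load_dataset
from setfit import SetFitModel, Trainer, TrainingArguments, sample_dataset

# Small labeled subset, as in the quickstart
dataset = load_dataset("sst2")
train_dataset = sample_dataset(dataset["train"], label_column="label", num_samples=8)

model = SetFitModel.from_pretrained("sentence-transformers/paraphrase-mpnet-base-v2")

args = TrainingArguments(
    batch_size=16,
    num_epochs=1,
    max_length=5,  # expectation: every batch truncated to 5 tokens
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    column_mapping={"sentence": "text", "label": "label"},
)
trainer.train()
```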
Expected Behavior:

Batches with a max length of 5 tokens in the Transformers training loop.

Actual Behavior:
Batches whose token length equals the longest sequence in the batch, i.e. `max_length` is never applied.
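One way to see this is to tokenize a batch directly; a hypothetical probe (not part of the original report), assuming the SentenceTransformer body is reachable as `model_body`:

```python
# Inspect the token length the SentenceTransformer body actually produces.
features = model.model_body.tokenize(
    ["this example sentence is clearly longer than five tokens", "short"]
)
# The second dimension tracks the longest sequence in the batch,
# not the max_length of 5 passed via TrainingArguments.
print(features["input_ids"].shape)
```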
Possible Culprits:
This function is responsible for propagating the SetFit training arguments to the SentenceTransformer trainer. `args.max_length` is not referenced in this method, nor does it appear to be supported as a parameter to that trainer anyway.
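For comparison, truncation does kick in when the maximum sequence length is set directly on the underlying SentenceTransformer. A sketch of that workaround, assuming `SetFitModel` exposes the body as `model_body` (where `max_seq_length` controls tokenizer truncation):

```python
# Possible workaround until max_length is propagated properly:
model.model_body.max_seq_length = 5
trainer.train()  # batches should now be truncated to 5 tokens
```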
Environment Info:
Happy to look into / propose a fix if appropriate!