Closed daniel4x closed 9 months ago
Thanks, that's a good catch - I usually run fine-tuning with the run_many
script to schedule several repeats with different seeds to get mean/std, but it works in a single run as well.
It was a copy-paste mistake 😄 .
I also missed the actual slicing in the for loop over the training data, so the effective batch_per_epoch is still len(train_loader),
since the variable is never used (it only appeared in the print).
```python
# run.py training loop
from itertools import islice

for batch in islice(train_loader, batch_per_epoch):
    ...
```
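For illustration, here is a minimal sketch of how `islice` caps the number of batches drawn per epoch (the loader and counts below are dummy stand-ins, not the repo's actual objects):

```python
from itertools import islice

# Dummy stand-ins: pretend the loader yields 10 batches per full pass
train_loader = range(10)
batch_per_epoch = 3  # cap the number of batches consumed this epoch

seen = [batch for batch in islice(train_loader, batch_per_epoch)]
print(seen)  # → [0, 1, 2]

# Passing None as the stop argument iterates the loader fully,
# so batch_per_epoch=None recovers the old "use every batch" behavior.
full = [batch for batch in islice(train_loader, None)]
print(len(full))  # → 10
```

Note that `islice(train_loader, None)` exhausts the iterable, so a `None` default keeps the current behavior when the flag isn't set.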
I'll create another patch for that, wdyt? @migalkin
Sure, go ahead
Allow specifying batch_per_epoch in the finetune script (probably missed it?)