Closed anothertal3 closed 11 months ago
Nevermind. I somehow didn't realize that there was a dataset configuration difference (resolution 704 vs. 768), which made all the difference.
That said, I still feel that training got slower within the last week. But that would be strange, as the trainer itself didn't change...
Anyway, I'm closing this issue.
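For anyone landing here later: a quick back-of-envelope check shows why a bump from 704 to 768 is plausible as the culprit. This is an illustrative sketch with assumed scaling behavior (convolution-type layers roughly linear in pixel count, self-attention roughly quadratic in token count), not measurements from the Lora-Trainer itself, and the function names are made up for the example:

```python
# Illustrative cost ratios for training at 768px vs. 704px.
# These are rough scaling assumptions, not profiled numbers.

def pixel_ratio(r_new: int, r_old: int) -> float:
    """Convolution-type layers scale roughly linearly with pixel count."""
    return (r_new ** 2) / (r_old ** 2)

def attention_ratio(r_new: int, r_old: int) -> float:
    """Self-attention scales roughly quadratically with token (pixel) count."""
    return pixel_ratio(r_new, r_old) ** 2

print(f"pixel count ratio:    {pixel_ratio(768, 704):.2f}x")    # ~1.19x
print(f"self-attention ratio: {attention_ratio(768, 704):.2f}x")  # ~1.42x
```

The raw compute ratio alone doesn't fully explain a 25 -> 50 minute jump, but on a memory-constrained T4 the larger resolution can also force smaller effective batches or spill into slower memory paths, so a roughly 2x wall-clock difference is not surprising.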
Until just now I had been using an earlier version of the Lora-Trainer (approx. 232294d4598fe3325cd968ff3762c9e888607677). Using a T4 in Colab, the following would take me around 25-30 minutes:
I've now switched to the most recent version and ran an almost identical configuration, with only the following changes (due to new defaults):
The same training process on a T4 now takes approx. 50 minutes.
Is this to be expected?