omni-us / research-seq2seq-HTR

MIT License

I am also trying to train on the IAM dataset, but it's taking too much time. Which GPU did you use, and for how many epochs? #2

Closed thelastfunction closed 5 years ago

thelastfunction commented 5 years ago

Hi, in https://github.com/omni-us/research-seq2seq-HTR/blob/e5de55d64ce68ef6e7f9128baa4e5f63bfe48897/main_torch_latest.py#L276 the number of epochs is set to around 5 million. It will take weeks to train on my GPU (NVIDIA GTX 1080 Ti).

leitro commented 5 years ago

Hi! It doesn't run until the maximum epoch, as we have an early stopping mechanism. Please refer to line 309 in the same file you found. Cheers!
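For readers unfamiliar with the pattern: early stopping means the training loop may declare a huge maximum epoch count but terminates once the validation metric stops improving. The sketch below is a generic patience-based version and is only illustrative; the function names (`train_one_epoch`, `evaluate`) and the `PATIENCE` value are assumptions, not taken from `main_torch_latest.py`.

```python
# Hypothetical sketch of patience-based early stopping.
# MAX_EPOCHS is a huge upper bound that is rarely reached;
# training stops once validation loss plateaus for PATIENCE epochs.

MAX_EPOCHS = 5_000_000
PATIENCE = 15  # illustrative value, not from the repository

def train_with_early_stopping(train_one_epoch, evaluate):
    best_loss = float("inf")
    epochs_since_improvement = 0
    for epoch in range(MAX_EPOCHS):
        train_one_epoch(epoch)          # one pass over the training set
        val_loss = evaluate(epoch)      # validation loss for this epoch
        if val_loss < best_loss:
            best_loss = val_loss
            epochs_since_improvement = 0
        else:
            epochs_since_improvement += 1
        if epochs_since_improvement >= PATIENCE:
            break  # early stop: no improvement for PATIENCE epochs
    return epoch + 1, best_loss
```

So even with a nominal 5 million epochs, training on IAM ends as soon as the validation loss stops improving.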

thelastfunction commented 5 years ago

@littlethunder thanks for clearing up my confusion! :)