charigyang / itsabouttime

Code repository for "It's About Time: Analog Clock Reading in the Wild"
MIT License

training time #2

Open catherine-qian opened 2 years ago

catherine-qian commented 2 years ago

Dear authors,

Thanks for this wonderful work! Could you please share your training time (e.g., on how many GPUs) and the total number of epochs? Did you use an early-stopping scheme?

charigyang commented 2 years ago

Hi, sorry for the late reply.

We train on a single GPU, roughly one day per round: 100k steps on synthetic data (though 20k would suffice) and 20k steps when finetuning. No, we did not use early stopping or vary the learning rate.
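For reference, the schedule described above can be sketched as follows. This is only an illustration of the two-phase step budget, not code from this repository; `train_step` and `run_schedule` are hypothetical names.

```python
def train_step(lr):
    """Placeholder for one optimizer update (forward, loss, backward, step).

    Hypothetical stand-in for the actual model code; returns a dummy loss.
    """
    return 0.0


def run_schedule(pretrain_steps=100_000, finetune_steps=20_000, lr=1e-4):
    """Two-phase schedule as described in this thread: synthetic pretraining,
    then finetuning, with a constant learning rate and no early stopping.

    Returns the total number of optimizer steps taken.
    """
    steps_done = 0
    for phase, n_steps in (("synthetic", pretrain_steps), ("finetune", finetune_steps)):
        for _ in range(n_steps):
            train_step(lr)  # learning rate is held fixed throughout
            steps_done += 1
    return steps_done
```

Per the reply above, the synthetic phase could likely be cut to 20k steps with little loss.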