yinboc / liif

Learning Continuous Image Representation with Local Implicit Image Function, in CVPR 2021 (Oral)
https://yinboc.github.io/liif/
BSD 3-Clause "New" or "Revised" License

True number of epochs (200 or 1000)? #36

Closed: nivha closed this issue 3 years ago

nivha commented 3 years ago

Hi, very cool work! In your paper you write that you train for 200 epochs, but the config files included in this repo use 1000 epochs. Should there be a big difference between the two options? In terms of runtime it matters a lot: roughly 25 vs. 5 hours of training time. I wonder whether the final quality also changes. Thanks!
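For anyone weighing the two settings, here is a minimal back-of-the-envelope sketch in Python. It assumes the training configs store the epoch count under an `epoch_max` key and sit at paths like `configs/train-div2k/train_edsr-baseline-liif.yaml` (both based on how this repo's configs appear to be laid out; adjust if yours differ):

```python
import yaml

def estimate_training_hours(config_path: str, minutes_per_epoch: float) -> float:
    """Hypothetical helper: estimate total wall-clock training time for a
    liif-style config. Assumes the number of training epochs is stored
    under the "epoch_max" key, as the DIV2K configs in this repo appear to do."""
    with open(config_path) as f:
        cfg = yaml.safe_load(f)
    epochs = cfg.get("epoch_max", 1000)  # fall back to the 1000-epoch default
    return epochs * minutes_per_epoch / 60.0

# Example: at ~2 min/epoch, 1000 epochs comes to roughly 33 hours,
# while 200 epochs would be closer to 7 hours.
print(estimate_training_hours("configs/train-div2k/train_edsr-baseline-liif.yaml", 2.0))
```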

DongHwanJang commented 3 years ago

@nivha Did it take only 25 hours to train the model for 1000 epochs? May I ask which GPU you used? By the way, according to the paper, they used 1000 epochs for the DIV2K dataset and 200 epochs for the CelebA dataset.

nivha commented 3 years ago

Hi @DongHwanJang, it actually took about 31 hours to train the model for 1000 epochs. When I wrote the question, training had only just started, and since it wasn't the point of my question I mistakenly assumed 1.5 min per epoch, when it's more like 2 min per epoch. I used an RTX 8000. You're right about the 1000 epochs, thanks!

xiximelon commented 1 year ago

@nivha Hello, may I ask which .yaml config you used for training? One epoch takes about 7 minutes on an RTX 3090 for me. Thanks!

nivha commented 1 year ago

@xiximelon Hi, it was two years ago... I think it was DIV2K on an RTX 8000, as I wrote above, but unfortunately I don't remember more than that.

xiximelon commented 1 year ago

> @xiximelon Hi, it was two years ago... I think it was DIV2K on an RTX 8000, as I wrote above, but unfortunately I don't remember more than that.

Thanks for the reply!