Closed xiaosu-zhu closed 2 months ago
Great work! I'm wondering how much GPU time training the different models requires? I can't find this described in the paper.
Thank you @xiaosu-zhu. Training VAR-d16 for 200 epochs on ImageNet 256x256 takes 2.5 days on 16 A100s. Training VAR-d30 for 350 epochs on ImageNet 512x512 with progressive training requires 256 A100s for around 4 days.
We'll add these numbers in the next version of the paper.
Thanks for your reply. 👍
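For anyone budgeting compute, the figures quoted above can be turned into total accelerator-days with a quick back-of-the-envelope calculation. This is just an illustrative sketch; the helper function is hypothetical, and only the GPU counts and wall-clock days come from the reply in this thread.

```python
# Rough GPU-budget arithmetic for the numbers quoted in this thread.

def gpu_days(num_gpus: int, wall_clock_days: float) -> float:
    """Total accelerator-days consumed by a single training run."""
    return num_gpus * wall_clock_days

# VAR-d16, 200 epochs, ImageNet 256x256: 16 A100s for 2.5 days
d16 = gpu_days(16, 2.5)

# VAR-d30, 350 epochs, ImageNet 512x512 (progressive): 256 A100s for ~4 days
d30 = gpu_days(256, 4)

print(f"VAR-d16: {d16:.0f} A100-days ({d16 * 24:.0f} GPU-hours)")
print(f"VAR-d30: {d30:.0f} A100-days ({d30 * 24:.0f} GPU-hours)")
```

So VAR-d30 at 512x512 is roughly 25x the compute budget of VAR-d16 at 256x256.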