TACJu / TransFG

This is the official PyTorch implementation of the paper "TransFG: A Transformer Architecture for Fine-grained Recognition" (Ju He, Jie-Neng Chen, Shuai Liu, Adam Kortylewski, Cheng Yang, Yutong Bai, Changhu Wang, Alan Yuille).
MIT License

About training details #28

Open JingjunYi opened 2 years ago

JingjunYi commented 2 years ago

Hi, thanks for your great work. I would like to know how many epochs/steps you trained for on each benchmark. Thanks again!

20713 commented 2 years ago

> Hi, thanks for your great work. I would like to know how many epochs/steps you trained for on each benchmark. Thanks again!

It is my question as well. I noticed that the value set in the code is 10000, which seemed surprisingly high for an epoch count.

Jing--Li commented 2 years ago

10000 is the number of iterations, not epochs!
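For context, here is a minimal sketch of what a step-based training loop looks like; the flag name `--num_steps`, the dummy data, and all values are assumptions for illustration, not copied from this repo's train.py:

```python
import argparse

import torch
from torch.utils.data import DataLoader, TensorDataset

# Sketch of a step-based loop: training is bounded by a total step count,
# not an epoch count. Flag name and defaults are assumptions for illustration.
parser = argparse.ArgumentParser()
parser.add_argument("--num_steps", default=10000, type=int,
                    help="total number of optimization steps")
args = parser.parse_args([])

# Dummy data standing in for a fine-grained dataset.
loader = DataLoader(
    TensorDataset(torch.randn(64, 3, 32, 32), torch.randint(0, 200, (64,))),
    batch_size=16, shuffle=True,
)

global_step = 0
while global_step < args.num_steps:       # the outer loop restarts the loader,
    for images, labels in loader:         # so epoch boundaries are implicit
        # ... forward pass, loss, backward, optimizer.step() go here ...
        global_step += 1
        if global_step >= args.num_steps:
            break
```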

895318 commented 1 year ago

> 10000 is the number of iterations, not epochs!

May I ask where the number of epochs is set in the code?

Jing--Li commented 12 months ago

> 10000 is the number of iterations, not epochs!
>
> May I ask where the number of epochs is set in the code?

It seems the code only uses iterations; there is no explicit epoch setting. You can estimate the number of epochs from the number of iterations, the batch size, and the size of the training set.
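As a concrete example of that estimate (the batch size below is illustrative; 5,994 is the size of the CUB-200-2011 training split):

```python
# Rough conversion from optimization steps to epochs.
# Substitute your actual batch size and training-set size.
num_steps = 10000        # total steps, as discussed above
train_batch_size = 16    # example per-step batch size
train_set_size = 5994    # e.g. the CUB-200-2011 training split

steps_per_epoch = train_set_size / train_batch_size
approx_epochs = num_steps / steps_per_epoch
print(f"{steps_per_epoch:.0f} steps per epoch -> 10000 steps is about {approx_epochs:.0f} epochs")
```

With these example numbers, 10000 steps works out to roughly 27 epochs; the result scales with your own batch size and dataset.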