raoyongming / GFNet

[NeurIPS 2021] [T-PAMI] Global Filter Networks for Image Classification
https://gfnet.ivg-research.xyz/
MIT License

Memory and FLOPs concern? #20

Closed techmonsterwang closed 1 year ago

techmonsterwang commented 1 year ago

Hi! Very interesting work!

How are the Params calculated? Do you use profile?

I have noticed that you use a script to calculate memory and FLOPs. Could you share the script? Many thanks to the author~

raoyongming commented 1 year ago

Hi, thanks for your interest in our work. You can use the script provided in our HorNet repo to compute FLOPs, which is also compatible with GFNet models. For memory usage, we use torch.cuda.max_memory_allocated to measure the max GPU memory usage during training.
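
For reference, a minimal sketch of the memory measurement described above, using torch.cuda.max_memory_allocated around one training step. The model and batch here are placeholders, not the actual GFNet training configuration, and a CUDA device is assumed:

```python
import torch

# Placeholder model and batch; substitute a GFNet model and a training-size batch.
model = torch.nn.Linear(224 * 224 * 3, 1000).cuda()
inputs = torch.randn(64, 224 * 224 * 3, device="cuda")

torch.cuda.reset_peak_memory_stats()  # clear the running peak counter

# One representative training step (forward + backward).
out = model(inputs)
loss = out.mean()
loss.backward()

torch.cuda.synchronize()
peak_bytes = torch.cuda.max_memory_allocated()  # peak GPU memory allocated so far
print(f"peak GPU memory: {peak_bytes / 1024 ** 2:.1f} MB")
```

Resetting the peak counter before the step and synchronizing before reading it keeps the number tied to that step rather than to earlier allocations.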

techmonsterwang commented 1 year ago

Many thanks for the nice response!

I have another question about the parameters~

Do you use "flops, params = profile(model, inputs=(input, ))" to calculate the params (number of parameters)?

Maybe the learnable filter weight K is not taken into account?
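
As a sanity check, independent of any profiler, the parameter count can be taken directly from model.parameters(); any learnable filter registered as an nn.Parameter is then included by construction. A minimal sketch with a placeholder model, not the GFNet definition:

```python
import torch.nn as nn

def count_params(model: nn.Module) -> int:
    # Sums every registered nn.Parameter, so a learnable global filter
    # registered via nn.Parameter is counted automatically.
    return sum(p.numel() for p in model.parameters() if p.requires_grad)

# Hypothetical usage; replace with an actual GFNet model.
model = nn.Sequential(nn.Linear(196, 384), nn.GELU(), nn.Linear(384, 196))
print(f"params: {count_params(model) / 1e6:.2f} M")
```

Note that if a filter is stored as a complex tensor, numel() counts complex elements; multiply that tensor's count by 2 if real and imaginary components should be counted separately.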