raoyongming / DynamicViT

[NeurIPS 2021] [T-PAMI] DynamicViT: Efficient Vision Transformers with Dynamic Token Sparsification
https://dynamicvit.ivg-research.xyz/
MIT License

Flops tools #19

Closed waynelrs closed 2 years ago

waynelrs commented 2 years ago

Hi, this is wonderful and solid work. I have a few questions about FLOPs. In your paper you report the models' complexity in GFLOPs. Which package did you use to compute the GFLOPs? As far as I know, the popular package ptflops (from ptflops import get_model_complexity_info) reports its result in MACs, not GFLOPs. Thanks in advance.
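
For reference, a minimal sketch of how ptflops is typically called (the torchvision model below is only a stand-in for illustration, not DynamicViT). It returns MACs, which is roughly half the FLOP count for multiply-add-dominated networks:

```python
# Minimal sketch (illustrative, not from the DynamicViT repo): ptflops reports MACs.
import torch
from ptflops import get_model_complexity_info
from torchvision.models import resnet18  # stand-in model for illustration

model = resnet18()

# as_strings=False returns raw numbers instead of formatted strings
macs, params = get_model_complexity_info(
    model, (3, 224, 224), as_strings=False, print_per_layer_stat=False
)

# One MAC is commonly counted as two FLOPs (one multiply + one add)
print(f"MACs: {macs / 1e9:.2f} G, approx. FLOPs: {2 * macs / 1e9:.2f} G")
```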

raoyongming commented 2 years ago

Hi, thanks for your interest in our work. I think fvcore is a good tool to compute the model's FLOPs. It provides an operator-level counter that can correctly compute the complexity of self-attention layers.
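
As a rough sketch of what that might look like (the model name and input shape are placeholders, not the repo's evaluation script; timm is assumed to be installed). Note that fvcore counts a fused multiply-add as a single FLOP by default, so check which convention you are comparing against:

```python
# Minimal sketch (placeholders, not the repo's evaluation script):
# counting FLOPs for a ViT-style model with fvcore.
import torch
from fvcore.nn import FlopCountAnalysis
from timm import create_model  # assumes timm is installed; model name is illustrative

model = create_model("deit_small_patch16_224", pretrained=False).eval()
dummy_input = torch.randn(1, 3, 224, 224)  # single ImageNet-sized image

with torch.no_grad():
    flops = FlopCountAnalysis(model, dummy_input)
    # fvcore counts a fused multiply-add as one FLOP by default
    print(f"GFLOPs: {flops.total() / 1e9:.2f}")
```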

waynelrs commented 2 years ago

> Hi, thanks for your interest in our work. I think fvcore is a good tool to compute the model's FLOPs. It provides an operator-level counter that can correctly compute the complexity of self-attention layers.

Thanks!