OpenGVLab / Vision-RWKV

Vision-RWKV: Efficient and Scalable Visual Perception with RWKV-Like Architectures
https://arxiv.org/abs/2403.02308
Apache License 2.0

Request for Method to Calculate FLOPs and Params for vision-rwkv Model #12

Closed: xiaojieli0903 closed this issue 7 months ago

xiaojieli0903 commented 7 months ago

Hello,

Could you provide a method to calculate the FLOPs and parameters for the vision-rwkv model? I have attempted to use the get_flops tool from mmpretrain, but encountered the following error:

File "mmpretrain/mmcls_custom/models/backbones/vrwkv.py", line 217, in forward
    x = _inner_forward(x)
File "mmpretrain/mmcls_custom/models/backbones/vrwkv.py", line 208, in _inner_forward
    x = RUN_CUDA(B, T, C, self.spatial_decay / T, self.spatial_first / T, k, v)
File "mmpretrain/mmcls_custom/models/backbones/vrwkv.py", line 90, in RUN_CUDA
    return WKV.apply(B, T, C, w.cuda(), u.cuda(), k.cuda(), v.cuda())
File "/home/miniconda3/envs/openmmlab/lib/python3.8/site-packages/torch/autograd/function.py", line 506, in apply
    return super().apply(*args, **kwargs)  # type: ignore[misc]
RuntimeError: _Map_base::at

Any help or guidance on how to resolve this would be greatly appreciated.

Thank you!
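
(Side note: counting parameters alone does not require a forward pass, so it never touches the WKV CUDA kernel that triggers the error above. A minimal sketch, where build_vrwkv_backbone() is a hypothetical placeholder for however the backbone is constructed from the config, not an actual function in this repo:)

    import torch

    def count_parameters(model: torch.nn.Module) -> int:
        # Sum the element count of every learnable tensor in the module.
        return sum(p.numel() for p in model.parameters())

    # backbone = build_vrwkv_backbone()  # hypothetical builder, replace with your own construction code
    # print(f'Params: {count_parameters(backbone) / 1e6:.2f} M')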

duanduanduanyuchen commented 7 months ago

@xiaojieli0903 Hi, the script for calculating FLOPs and parameters has been uploaded (get_flops_backbone.py). You can use it as follows:

cd Vision-RWKV/classification
python get_flops_backbone.py configs/vrwkv/vrwkv_tiny_8xb128_in1k.py
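
(For reference, a rough sketch of what such a complexity script typically does, using fvcore for FLOP counting. This is an assumption, not a reproduction of get_flops_backbone.py; the backbone builder and the 224x224 input size are placeholders:)

    import torch
    from fvcore.nn import FlopCountAnalysis

    def estimate_flops(backbone: torch.nn.Module, img_size: int = 224) -> float:
        # fvcore traces one forward pass; the WKV kernel expects CUDA tensors,
        # so move the model and the dummy input to the GPU first. Ops that
        # fvcore does not recognise are skipped (with a warning) unless a
        # handler is registered via FlopCountAnalysis.set_op_handle.
        backbone = backbone.cuda().eval()
        dummy = torch.randn(1, 3, img_size, img_size, device='cuda')
        return FlopCountAnalysis(backbone, dummy).total() / 1e9  # GFLOPs

    # estimate_flops(build_vrwkv_backbone())  # hypothetical builder for the VRWKV backbone
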
xiaojieli0903 commented 7 months ago

@duanduanduanyuchen

Thank you for providing the tool. I have successfully tested it.