Lyken17 / pytorch-OpCounter

Count the MACs / FLOPs of your PyTorch model.
MIT License

got 0 ops for nn.MultiheadAttention #197

Open wangtiance opened 1 year ago

wangtiance commented 1 year ago

My thop version: 0.1.1

Minimal code to replicate:

import torch
import torch.nn as nn
import thop

# embed_dim=100, num_heads=1, inputs shaped (batch, seq_len, embed_dim)
f = nn.MultiheadAttention(100, 1, batch_first=True)
x = torch.ones((1, 1000, 100))

result = thop.profile(f, (x, x, x))  # returns (MACs, params)
print(result)

I got (0.0, 0). Is this module not supported yet?

wangtiance commented 1 year ago

Does it only support modules listed here? https://github.com/Lyken17/pytorch-OpCounter/blob/43c064afb71383501e41eaef9e8c8407265cf77f/thop/profile.py#L21
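(For what it's worth, the set of built-in counters can be inspected at runtime; a minimal sketch, assuming the register_hooks dict from the linked file is importable:

from thop.profile import register_hooks

# Print every module type thop ships a counting rule for;
# nn.MultiheadAttention will not appear in this list.
for module_type in register_hooks:
    print(module_type.__name__)
)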

CaA23187 commented 1 year ago

You are right. I guess thop doesn't support the MHSA layer yet.
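(Until MHA gets built-in support, thop's custom_ops argument lets you register your own counting rule for a module type. A minimal sketch for self-attention; the MAC formula below is my own approximation (QKV projections, QK^T, attn @ V, output projection; softmax and biases ignored), not something thop ships:

import torch
import torch.nn as nn
import thop

def count_mha(m, x, y):
    # Assumes self-attention with batch_first=True, so query is (B, L, E).
    B, L, E = x[0].shape
    macs = 3 * B * L * E * E   # Q, K, V input projections
    macs += B * L * L * E      # Q @ K^T, summed over all heads
    macs += B * L * L * E      # attention weights @ V
    macs += B * L * E * E      # output projection
    m.total_ops += torch.DoubleTensor([int(macs)])

f = nn.MultiheadAttention(100, 1, batch_first=True)
x = torch.ones((1, 1000, 100))
macs, params = thop.profile(f, (x, x, x), custom_ops={nn.MultiheadAttention: count_mha})
print(macs, params)
)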

quancs commented 1 year ago

Facing the same problem.

HaoKang-Timmy commented 1 year ago

You may use https://github.com/HaoKang-Timmy/torchanalyse or torchprofile for NLP models

HaoKang-Timmy commented 1 year ago

> You may use https://github.com/HaoKang-Timmy/torchanalyse or torchprofile for NLP models

These two repos can profile transformers; I have tried them.
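(For anyone trying the torchprofile route, a minimal sketch of its API as I understand it; profile_macs jit-traces the module with the given args and sums the MACs:

import torch
import torch.nn as nn
from torchprofile import profile_macs  # pip install torchprofile

f = nn.MultiheadAttention(100, 1, batch_first=True)
x = torch.ones((1, 1000, 100))

macs = profile_macs(f, (x, x, x))
print(macs)
)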

CaA23187 commented 1 year ago

> You may use https://github.com/HaoKang-Timmy/torchanalyse or torchprofile for NLP models

Thank you for your reply. I'll try it!

cooma04 commented 5 months ago

It seems thop does not account for parameters created directly with nn.Parameter: for modules whose weights are bare nn.Parameter tensors, it simply reports 0.
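(A minimal sketch reproducing that observation; the Scale module here is hypothetical, just a module whose only weight is a bare nn.Parameter:

import torch
import torch.nn as nn
import thop

class Scale(nn.Module):  # hypothetical module with a single bare nn.Parameter
    def __init__(self, n):
        super().__init__()
        self.w = nn.Parameter(torch.ones(n))

    def forward(self, x):
        return x * self.w

macs, params = thop.profile(Scale(100), (torch.ones(1, 100),))
print(macs, params)  # reportedly (0.0, 0): no counting hook matches Scale
)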