Lyken17 / pytorch-OpCounter

Count the MACs / FLOPs of your PyTorch model.
MIT License

Why bias is not considered in count_linear()? #136

Closed xuefei1 closed 3 years ago

xuefei1 commented 3 years ago

This is how nn.Linear FLOPs are counted in the current version:

def count_linear(m, x, y):
    # per output element
    total_mul = m.in_features
    # total_add = m.in_features - 1
    # total_add += 1 if m.bias is not None else 0
    num_elements = y.numel()
    total_ops = total_mul * num_elements
    m.total_ops += torch.DoubleTensor([int(total_ops)])

Link: https://github.com/Lyken17/pytorch-OpCounter/blob/master/thop/vision/basic_hooks.py#L131-L140

I noticed that it doesn't consider whether the Linear layer has a bias. Adding the bias probably won't change the overall FLOPs much, but shouldn't we still check for the bias in the function?
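
For reference, a bias-aware variant that restores the commented-out additions could look roughly like the sketch below. Note that count_linear_with_bias is a hypothetical name, not something that exists in thop, and it assumes the same hook context as count_linear above (torch already imported, m.total_ops registered by thop):

def count_linear_with_bias(m, x, y):
    # per output element: in_features multiplications,
    # (in_features - 1) additions for the dot product,
    # and one extra addition when a bias term is present
    total_mul = m.in_features
    total_add = m.in_features - 1
    if m.bias is not None:
        total_add += 1
    num_elements = y.numel()
    total_ops = (total_mul + total_add) * num_elements
    m.total_ops += torch.DoubleTensor([int(total_ops)])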

Lyken17 commented 3 years ago

See https://github.com/Lyken17/pytorch-OpCounter/tree/master/benchmark for the discussion

Basically, for simplicity, thop only counts multiplications.
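
If you do want the bias (or the additions in general) included, thop's profile accepts a custom_ops argument, documented in the repository README, that overrides the counting rule for a module type. A minimal usage sketch, reusing the hypothetical count_linear_with_bias hook from above; the model and input shape here are arbitrary examples:

import torch
import torch.nn as nn
from thop import profile

model = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 10))
dummy_input = torch.randn(1, 128)

# override the default nn.Linear rule with the bias-aware hook sketched above
macs, params = profile(model, inputs=(dummy_input,),
                       custom_ops={nn.Linear: count_linear_with_bias})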