davidtvs / PyTorch-ENet

PyTorch implementation of ENet
MIT License

Question about the params and FLOPs of ENet #44

Open mrzhouxixi opened 4 years ago

mrzhouxixi commented 4 years ago

Has anyone reproduced ENet? The parameter count and GFLOPs of my reproduced network are about 10x and 4x larger, respectively, than the values reported in the original paper (Table 3). My calculated values: params: 3.5 million, GFLOPs: 16.9.
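
For reference, the parameter count can be sanity-checked without any profiling library by summing the tensor sizes directly. A minimal sketch against this repo's ENet, assuming 12 output classes (the CamVid setting used in this repo):

from models.enet import ENet  # model definition from this repository

model = ENet(12)  # assumption: 12 classes, as in the CamVid configuration
# Count trainable parameters; the original paper (Table 3) lists ENet at roughly 0.37M.
num_params = sum(p.numel() for p in model.parameters() if p.requires_grad)
print(f"{num_params / 1e6:.2f}M parameters")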

davidtvs commented 4 years ago

I just used THOP to confirm the number of parameters and GFLOPs for the same input size given in Table 3 of the paper (3 x 640 x 360):

import torch

from models.enet import ENet
from thop import profile

model = ENet(12).to('cpu')
# Input size from Table 3 of the paper: 3 x 640 x 360
input = torch.randn(1, 3, 640, 360)
flops, num_parameters = profile(model, (input,), verbose=False)

I got 2.2 GFLOPs and 0.35 million parameters. There's very little difference in the number of parameters, but a significant difference in the number of FLOPs that I could look into.

How did you find your calculated values?
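
As a side note, THOP's raw counts can be turned into human-readable strings with its clever_format helper; a minimal, self-contained sketch using the values above purely as placeholders:

from thop import clever_format

# clever_format converts the raw counts returned by profile() into readable strings.
# The numbers here (~2.2e9 and ~0.35e6) just mirror the results reported above.
flops, num_parameters = 2.2e9, 0.35e6
readable_flops, readable_params = clever_format([flops, num_parameters], "%.2f")
print(readable_flops, readable_params)  # e.g. 2.20G 350.00K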

rashedkoutayni commented 3 years ago

For a 3 x 224 x 224 input: computational complexity 0.48 GMac, number of parameters 350.65 k.

For a 3 x 640 x 360 input: computational complexity 2.2 GMac, number of parameters 350.65 k.

This profiling was done using ptflops.
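
For reference, a minimal sketch of how such numbers can be obtained with ptflops, assuming this repo's ENet with 12 classes (the exact call used above may have differed):

from ptflops import get_model_complexity_info
from models.enet import ENet  # model definition from this repository

model = ENet(12)  # assumption: 12 classes, as in the CamVid configuration
# ptflops reports multiply-accumulate counts (GMac) and the parameter count.
macs, params = get_model_complexity_info(
    model, (3, 640, 360),
    as_strings=True, print_per_layer_stat=False, verbose=False
)
print(macs)    # e.g. "2.2 GMac"
print(params)  # e.g. "350.65 k"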