Closed InstantWindy closed 6 years ago
I have rewritten the code using PyTorch. It looks like there are some tools to calculate GFLOPS in PyTorch.
Does a model with fewer parameters always have a shorter inference time?
Generally speaking, yes. Please take a look at my article for more details: Real-Time deep learning in mobile application
How do you calculate the GFLOPS of a model? And does a model with fewer parameters always have a shorter inference time?
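One common way to estimate a model's FLOPs in PyTorch is to attach forward hooks that count multiply-accumulates per layer (third-party packages such as thop/ptflops follow the same idea). Below is a minimal sketch, not a definitive implementation: the `count_flops` helper is hypothetical, it only counts `Conv2d` and `Linear` layers, and it ignores everything else (activations, normalization, pooling).

```python
import torch
import torch.nn as nn

def count_flops(model, input_size):
    # Hypothetical helper: rough FLOP count via forward hooks.
    # Only Conv2d and Linear are counted; other ops are ignored.
    flops = [0]
    hooks = []

    def conv_hook(module, inputs, output):
        # MACs per output element = kh * kw * in_channels / groups
        kh, kw = module.kernel_size
        macs = output.numel() * kh * kw * module.in_channels // module.groups
        flops[0] += 2 * macs  # 1 MAC = 1 multiply + 1 add

    def linear_hook(module, inputs, output):
        macs = output.numel() * module.in_features
        flops[0] += 2 * macs

    for m in model.modules():
        if isinstance(m, nn.Conv2d):
            hooks.append(m.register_forward_hook(conv_hook))
        elif isinstance(m, nn.Linear):
            hooks.append(m.register_forward_hook(linear_hook))

    with torch.no_grad():
        model(torch.zeros(1, *input_size))  # dummy forward pass
    for h in hooks:
        h.remove()
    return flops[0]

model = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1),
    nn.ReLU(),
    nn.Flatten(),
    nn.Linear(16 * 32 * 32, 10),
)
print(count_flops(model, (3, 32, 32)) / 1e9, "GFLOPs")
```

Note that FLOPs and parameter count are only proxies for speed: memory access patterns, layer shapes, and hardware parallelism mean a smaller model is not always faster in practice.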