jacobkimmel / pytorch_modelsize

Estimates the size of a PyTorch model in memory
MIT License
354 stars 49 forks
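As context for the discussion below, the parameter portion of such a size estimate can be sketched as summing `numel() * element_size()` over the model's parameters (a minimal illustration, not the library's exact implementation):

```python
import torch.nn as nn

def param_bytes(model: nn.Module) -> int:
    """Total memory held by model parameters, in bytes."""
    return sum(p.numel() * p.element_size() for p in model.parameters())

model = nn.Linear(320, 50)  # 320*50 weights + 50 biases, float32
print(param_bytes(model))   # (16000 + 50) * 4 = 64200 bytes
```

Estimating the activation memory is the harder part, and is where the issue below arises: it requires tracing output shapes through the forward pass.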

Cannot compute the size of a model with multiple layer types #2

Closed Sanghyun-Hong closed 6 years ago

Sanghyun-Hong commented 6 years ago

This tool does not work for a model with multiple layer types. As a simple example, running it on the MNIST sample (https://github.com/pytorch/examples/tree/master/mnist) raises an error.

This needs to be fixed, since in practice no model uses only one layer type. [A fix is discussed here as well: https://discuss.pytorch.org/t/gpu-memory-estimation-given-a-network/1713]
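For reference, a minimal sketch of the kind of model that triggers the failure (simplified from the pytorch/examples MNIST sample; layer sizes are illustrative):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Net(nn.Module):
    """Simplified MNIST-style model mixing conv and linear layers."""
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(1, 10, kernel_size=5)
        self.conv2 = nn.Conv2d(10, 20, kernel_size=5)
        self.fc1 = nn.Linear(320, 50)
        self.fc2 = nn.Linear(50, 10)

    def forward(self, x):
        x = F.relu(F.max_pool2d(self.conv1(x), 2))   # 28x28 -> 24x24 -> 12x12
        x = F.relu(F.max_pool2d(self.conv2(x), 2))   # 12x12 -> 8x8 -> 4x4
        # The reshape below happens *outside* any module, so a tool that
        # walks child modules and propagates input sizes cannot see it.
        x = x.view(-1, 320)                          # 20 * 4 * 4 = 320
        x = F.relu(self.fc1(x))
        return self.fc2(x)

model = Net()
out = model(torch.randn(1, 1, 28, 28))
print(out.shape)  # torch.Size([1, 10])
```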

jacobkimmel commented 6 years ago

The tool fails because a dimensionality change occurs outside a module in forward() (see the call to view()). I'm not aware of a way to identify arbitrary dimensionality changes that aren't performed by a module.
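One common workaround (my own suggestion here, not something the tool does for you) is to move the reshape into a module of its own so that module-walking code can observe it, e.g. a small Flatten wrapper:

```python
import torch
import torch.nn as nn

class Flatten(nn.Module):
    """Wraps the view() call in a module so size-estimation code that
    iterates over child modules can observe the shape change."""
    def forward(self, x):
        return x.view(x.size(0), -1)

# With the reshape inside a module, the whole MNIST-style model can be
# expressed as an nn.Sequential, which module-walking tools can traverse.
model = nn.Sequential(
    nn.Conv2d(1, 10, kernel_size=5),
    nn.MaxPool2d(2),
    nn.ReLU(),
    nn.Conv2d(10, 20, kernel_size=5),
    nn.MaxPool2d(2),
    nn.ReLU(),
    Flatten(),            # 20 x 4 x 4 -> 320
    nn.Linear(320, 50),
    nn.ReLU(),
    nn.Linear(50, 10),
)
out = model(torch.randn(1, 1, 28, 28))
print(out.shape)  # torch.Size([1, 10])
```

The trade-off is that every shape change must be expressed as a module, which is exactly the constraint the README describes.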

This is noted in the README.