aojunzz / NM-sparsity

Results layout #11

Closed: priyankaankolekar closed this issue 2 years ago

priyankaankolekar commented 2 years ago

Hello, I suppose the results in the Model Zoo for Classification are in the NCHW format. How do you convert them to the NHWC format (also mentioned in the README in the Classification folder)? Can this conversion be done while running training, i.e. in the train_imagenet.py script itself? Thanks in advance for your help.

aojunzz commented 2 years ago

@priyankaankolekar Hi,

The default format is NHWC; you can refer to Line 76: https://github.com/NM-sparsity/NM-sparsity/blob/main/devkit/sparse_ops/sparse_ops.py#:~:text=return%20Sparse_NHWC.apply(self.weight%2C%20self.N%2C%20self.M)
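
Roughly, the NHWC path permutes the OIHW weight so that every group of M consecutive entries runs along the input-channel dimension, then keeps the N largest-magnitude entries per group. A minimal sketch of that idea (not the repository's exact code; the function and variable names here are only illustrative):

import torch

def nm_mask_nhwc(weight, N=2, M=4):
    # weight: conv weight in PyTorch's default OIHW layout (K, C, R, S); assumes C % M == 0
    K, C, R, S = weight.shape
    # permute to (K, R, S, C) so each group of M spans M consecutive input channels
    w = weight.detach().abs().permute(0, 2, 3, 1).reshape(-1, M)
    # zero the M - N smallest-magnitude entries in every group
    idx = torch.argsort(w, dim=1)[:, :M - N]
    mask = torch.ones_like(w).scatter_(1, idx, 0.0)
    # restore the original OIHW layout
    return mask.reshape(K, R, S, C).permute(0, 3, 1, 2).contiguous()

# usage: keep 2 of every 4 weights along the input channels
weight = torch.randn(64, 64, 3, 3)
sparse_weight = weight * nm_mask_nhwc(weight, N=2, M=4)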

priyankaankolekar commented 2 years ago

So the results of the Model Zoo here (https://github.com/NM-sparsity/NM-sparsity/tree/main/classification, table titled "Results and Model Zoo") are in NHWC format? What is the difference between the tables "Converting the NCHW format in Pytorch to NHWC in ASP" and "Results and Model Zoo"? Thanks.

aojunzz commented 2 years ago

@priyankaankolekar Yes, the results in the model zoo are in NHWC format. The "Results and Model Zoo" table contains the original paper results. After the paper was published, NVIDIA released the ASP source code, and we noticed that ASP prunes the dense model using the NCHW format, so we also used our method to prune the dense model in the NCHW format; those results are shown in "Converting the NCHW format in Pytorch to NHWC in ASP". According to our experiments, the NCHW and NHWC formats had similar performance.
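
In case it helps later readers, the two groupings differ only in how the 4-D weight is reshaped before the N-out-of-M selection. A rough sketch with my own naming, not the exact code of either project:

# conv weight in PyTorch's native OIHW layout (K, C, R, S)
w = weight.detach().abs()

# NHWC-style grouping: each group of M spans M consecutive input channels
groups_nhwc = w.permute(0, 2, 3, 1).reshape(-1, M)

# NCHW-style grouping: reshape the native layout directly, so each group of M
# spans M consecutive positions of the flattened (C, R, S) axes
groups_nchw = w.reshape(-1, M)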

Thanks,

priyankaankolekar commented 2 years ago

Thank you so much for explaining this.

Priyanka.

791136190 commented 11 months ago


There is such an implementation in NVIDIA's ASP code:

# convs
t = t.permute(2,3,0,1).contiguous().view(shape[2]*shape[3]*shape[0], shape[1])
func = getattr(sys.modules[__name__], pattern, None)
mask = func(t, density)
mask = mask.view(shape[2], shape[3], shape[0], shape[1]).permute(2,3,0,1).contiguous()      
return mask.view(shape).type(ttype)

My question is: since PyTorch's default weight tensor layout is OIHW, should NVIDIA's layout after this permute be HWOI?
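
A quick shape check of what that permute does to a dummy OIHW weight, just to see the resulting order:

import torch

# dummy conv weight in PyTorch's default OIHW layout: (K, C, R, S) = (8, 16, 3, 3)
t = torch.randn(8, 16, 3, 3)
shape = t.shape

# same permute/view as in the ASP snippet above
t_perm = t.permute(2, 3, 0, 1).contiguous()                     # (R, S, K, C) = (3, 3, 8, 16)
t_flat = t_perm.view(shape[2] * shape[3] * shape[0], shape[1])  # (R*S*K, C) = (72, 16)
print(t_perm.shape, t_flat.shape)

So after the permute the dimension order is (R, S, K, C), i.e. HWOI in OIHW naming, and the mask function then sees the weight flattened to an (R*S*K, C) matrix.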