pytorch / pytorch

Tensors and Dynamic neural networks in Python with strong GPU acceleration
https://pytorch.org

[Feature Request] Any plan to add 'Sparse Convolution' as default nn module? #64544

Open demul opened 3 years ago

demul commented 3 years ago

🚀 Feature

I suggest the PyTorch team add 'Sparse Convolution' as a default nn module.

Motivation

I want to use Sparse Convolution without external libraries like 'spconv', 'Minkowski Engine', etc. Compiling these libraries is an error-prone struggle that burns me out every time.

Pitch

I would like PyTorch to provide a Sparse Convolution operation in the default nn module, or at least in a contrib branch.

cc @nikitaved @pearu @cpuhrsch @IvanYashchuk

cpuhrsch commented 3 years ago

Hello @demul,

Thank you for your feature request. Could you detail your use case for these types of convolutions?

Thank you, Christian

demul commented 3 years ago

Hello Christian.

What is Sparse Convolution?

Sparse Convolution is a hashing-based method for accelerating convolution. Its result is exactly the same as a regular convolution, but to run faster on sparse input it does not use 'im2col'; instead, it hashes only the valid (non-empty) indices.
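A minimal sketch of the idea (not a PyTorch API; function and variable names here are illustrative only): active coordinates and their features are kept in a hash map, and outputs are computed by direct neighbor lookups instead of building im2col columns.

```python
import torch

def sparse_conv2d(active_sites, weights, kernel_size=3):
    # active_sites: dict {(y, x): feature tensor of shape (C_in,)}
    # weights: tensor of shape (kernel_size, kernel_size, C_in, C_out)
    k = kernel_size // 2
    out = {}
    for (y, x) in active_sites:                 # outputs only at active sites (submanifold style)
        acc = torch.zeros(weights.shape[-1])
        for dy in range(-k, k + 1):
            for dx in range(-k, k + 1):
                neighbor = active_sites.get((y + dy, x + dx))  # hash lookup instead of im2col
                if neighbor is not None:                        # empty locations are skipped entirely
                    acc = acc + neighbor @ weights[dy + k, dx + k]
        out[(y, x)] = acc
    return out

# Example: only 2 active pixels in a (conceptually) huge grid, 4 -> 8 channels
sites = {(10, 10): torch.randn(4), (10, 11): torch.randn(4)}
w = torch.randn(3, 3, 4, 8)
print({c: f.shape for c, f in sparse_conv2d(sites, w).items()})
```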

For details, see the easy-to-read Medium article and the paper "3D Semantic Segmentation with Submanifold Sparse Convolutional Networks".

Use Case of Sparse Convolution

It runs faster on sparse inputs such as sparse 3D voxel grids.

Fig. 1 in the Medium article above shows a typical case in which Sparse Convolution works faster than vanilla convolution.

MinkowskiNet significantly accelerates its 3D/4D semantic segmentation task using Sparse Convolution.
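As a rough illustration of why this helps (the numbers below are assumptions for the sketch, not measurements from any paper: a 3×3×3 kernel and ~1% voxel occupancy):

```python
# Back-of-the-envelope multiply-accumulate (MAC) counts; all numbers are illustrative
grid_voxels   = 256 ** 3                 # dense 3D grid size
active_voxels = int(grid_voxels * 0.01)  # ~1% occupancy, typical of sparse voxel data
k3, c_in, c_out = 27, 32, 32             # 3x3x3 kernel, 32 -> 32 channels

dense_macs  = grid_voxels   * k3 * c_in * c_out  # dense Conv3d touches every voxel
sparse_macs = active_voxels * k3 * c_in * c_out  # sparse conv only at active sites (upper bound)
print(f"dense ~ {dense_macs:.1e} MACs, sparse <= {sparse_macs:.1e} MACs "
      f"({dense_macs / sparse_macs:.0f}x fewer)")
```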

Thank you, demul

cpuhrsch commented 3 years ago

Thank you for the context!