mit-han-lab / torchsparse

[MICRO'23, MLSys'22] TorchSparse: Efficient Training and Inference Framework for Sparse Convolution on GPUs.
https://torchsparse.mit.edu
MIT License

Tanh implementation #222

Closed. ErinTUDelft closed this issue 1 year ago.

ErinTUDelft commented 1 year ago

Thank you for creating this nice engine! I was wondering whether other activation functions, such as tanh, will be added in the future; right now only ReLU and leaky ReLU seem to be available.

Kind regards, Erin

zhijian-liu commented 1 year ago

You may follow the existing modules in https://github.com/mit-han-lab/torchsparse/blob/master/torchsparse/nn/modules/activation.py to implement other activation functions.
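
For example, a Tanh wrapper in the same style might look like the sketch below. This assumes the `fapply` helper from `torchsparse.nn.utils`, which the existing ReLU/LeakyReLU modules use to apply a dense op to the feature tensor of a `SparseTensor`; it is an illustration, not an official part of the library.

```python
import torch.nn as nn

from torchsparse import SparseTensor
from torchsparse.nn.utils import fapply


class Tanh(nn.Tanh):
    def forward(self, input: SparseTensor) -> SparseTensor:
        # fapply applies the dense activation to input.feats and returns a
        # new SparseTensor with the same coordinates and metadata, so the
        # module can be dropped into a sparse network like the built-in ReLU.
        return fapply(input, super().forward)
```

Usage is then the same as for the built-in activations: construct `Tanh()` and call it on a `SparseTensor` inside your model.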