
RBF-Softmax

RBF-Softmax is a simple but effective image classification loss function for deep neural networks. This RBF-Softmax project is written in PyTorch and is modified from pycls.

In RBF-Softmax, logits are computed with an RBF kernel and then scaled by a hyperparameter, so the weights of the last fully connected layer are treated as class prototypes.
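The idea above can be sketched as a small PyTorch head: the FC weights become class prototypes, squared distances to them pass through an RBF kernel, and a scale hyperparameter sharpens the resulting softmax. This is a minimal illustration, not the repository's actual implementation; the names `gamma` and `scale` are assumptions.

```python
import torch
import torch.nn as nn


class RBFSoftmaxHead(nn.Module):
    """Minimal sketch of an RBF-Softmax classification head.

    Assumes logits of the form scale * exp(-||x - w_k||^2 / gamma);
    `gamma` and `scale` are hypothetical hyperparameter names.
    """

    def __init__(self, feat_dim, num_classes, gamma=1.0, scale=10.0):
        super().__init__()
        # Each row plays the role of one class prototype (the last FC's weights).
        self.prototypes = nn.Parameter(torch.randn(num_classes, feat_dim))
        self.gamma = gamma
        self.scale = scale

    def forward(self, x):
        # Squared Euclidean distance: (batch, feat_dim) -> (batch, num_classes)
        dist = torch.cdist(x, self.prototypes).pow(2)
        # The RBF kernel maps distances into (0, 1]; the scale
        # hyperparameter then sharpens the softmax over classes.
        return self.scale * torch.exp(-dist / self.gamma)


head = RBFSoftmaxHead(feat_dim=2, num_classes=10)
logits = head(torch.randn(4, 2))
loss = nn.CrossEntropyLoss()(logits, torch.tensor([0, 1, 2, 3]))
```

The logits are then fed to the standard cross-entropy loss, so the only change relative to plain softmax training is how the logits are produced.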

RBF-Softmax Pipeline

The MNIST toy demo visualization of RBF-Softmax and other losses.

RBF-Softmax

The following GIF shows the 2D feature visualization of RBF-Softmax trained on MNIST. As training proceeds, the intra-class distances become smaller and smaller.

Feature Vis.

Introduction

Training and Testing RBF-Softmax

Model Zoo

We provide some final results and pretrained models for download in the Model Zoo. Note that in the paper we report the best top-1 performance over the whole training run.

ImageNet Results: the best top-1 errors of different models trained with RBF-Softmax.

Citing

If you find RBF-Softmax helpful in your research, please consider citing:

@InProceedings{xzhang2020rbf,
  title = {RBF-Softmax: Learning Deep Representative Prototypes with Radial Basis Function Softmax},
  author = {Zhang, Xiao and Zhao, Rui and Qiao, Yu and Li, Hongsheng},
  booktitle = {ECCV},
  year = {2020}
}

License

RBF-Softmax is licensed under the MIT license. Please see the LICENSE file for more information.