Summary
This paper introduces a novel loss function, tangent loss (TG), for classification tasks, aimed at improving the stability of neural network training. Experiments on real-world datasets show that TG loss achieves better or comparable results than the traditional cross-entropy (CE) loss while being more stable.
Key Points
Recently, some researchers have focused on loss functions that reduce intra-class variation and enlarge inter-class differences, mostly built on softmax cross-entropy loss and its variants.
Because existing work mainly modifies and adapts the original cross-entropy loss, it inherits an implicit problem of cross-entropy: excessive sensitivity to randomness and outliers in the training data, which leads to overfitting.
Tangent loss is proposed as a new form of loss function rather than another modification of cross-entropy.
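The sensitivity issue above is easy to see numerically: the per-sample CE penalty -log(p) is unbounded as the predicted probability p of the true class goes to 0, so a few outliers or mislabeled samples can dominate the total loss. The paper's exact TG formula is not restated here; as an illustrative sketch only, a hypothetical tangent-shaped penalty tan((π/4)·(1 - p)) shows how a bounded alternative behaves (it is 0 at p = 1 and at most tan(π/4) = 1 at p = 0):

```python
import numpy as np

# Per-sample penalty as a function of the probability p assigned to the
# true class. Cross-entropy, -log(p), is unbounded as p -> 0, so badly
# fit outliers can dominate the total loss.
def cross_entropy_penalty(p):
    return -np.log(p)

# Hypothetical tangent-shaped penalty (NOT the paper's TG definition,
# just an illustration of a bounded alternative): 0 at p = 1, and
# capped at tan(pi/4) = 1 as p -> 0.
def tangent_penalty(p):
    return np.tan((np.pi / 4) * (1 - p))

p = np.array([0.9, 0.5, 1e-4])   # last entry mimics an outlier/mislabel
print(cross_entropy_penalty(p))  # outlier term explodes (~9.21)
print(tangent_penalty(p))        # outlier term stays below 1
```

The bounded penalty caps each sample's contribution, which is one intuition for why a tangent-style loss can train more stably than CE on noisy data.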
Citation
Xu Zhang, Wenpeng Lu, Yan Pan, Hao Wu, Rongyao Wang, Rui Yu, Empirical study on tangent loss function for classification with deep neural networks, Computers & Electrical Engineering, Volume 90, 2021, 107000, ISSN 0045-7906, https://doi.org/10.1016/j.compeleceng.2021.107000.
Title
Empirical study on tangent loss function for classification with deep neural networks
URL
https://www.sciencedirect.com/science/article/pii/S0045790621000276
Repo link