lukasruff / Deep-SVDD-PyTorch

A PyTorch implementation of the Deep SVDD anomaly detection method
MIT License

About loss function #8

Closed liujiyuan13 closed 5 years ago

liujiyuan13 commented 5 years ago

In the paper "Deep One-Class Classification", the loss function is given as: image. However, in this code the loss is calculated as: image. These two differ: the loss in this code lacks the regularization term on the network weights. Is that term important, or is it handled somewhere else?
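(The screenshots are not preserved in this text. For reference, the objective can be written out from the paper as below, with phi the network, c the hypersphere center, and W^l the layer weights; this is a reconstruction, not the original images.)

```latex
% One-Class Deep SVDD objective as stated in "Deep One-Class Classification":
\min_{\mathcal{W}} \; \frac{1}{n}\sum_{i=1}^{n} \big\| \phi(x_i;\mathcal{W}) - \mathbf{c} \big\|^{2}
  \;+\; \frac{\lambda}{2}\sum_{\ell=1}^{L} \big\| W^{\ell} \big\|_{F}^{2}

% The loss computed in this repository's trainer is only the first (data) term:
\frac{1}{n}\sum_{i=1}^{n} \big\| \phi(x_i;\mathcal{W}) - \mathbf{c} \big\|^{2}
```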

rsaite commented 5 years ago

The last term in the loss function is commonly referred to as weight decay, and it is handled in another place: its strength (lambda) is set through the --weight_decay parameter (and --ae_weight_decay for pretraining), which is passed to the optimizer.
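For illustration, a minimal sketch of how an L2 penalty on the weights is typically attached through a PyTorch optimizer; the network definition and hyperparameter values below are placeholders, not the repository's exact code:

```python
import torch
import torch.optim as optim

# Placeholder network standing in for the Deep SVDD encoder phi(.; W).
net = torch.nn.Sequential(
    torch.nn.Linear(32, 16),
    torch.nn.ReLU(),
    torch.nn.Linear(16, 8),
)

# weight_decay plays the role of lambda: the optimizer applies the L2 penalty on
# the weights during the update, so the term does not need to appear explicitly
# in the training loss.
optimizer = optim.Adam(net.parameters(), lr=1e-4, weight_decay=1e-6)
```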

liujiyuan13 commented 5 years ago

> The last term in the loss function is commonly referred to as weight decay, and it is handled in another place: its strength (lambda) is set through the --weight_decay parameter (and --ae_weight_decay for pretraining), which is passed to the optimizer.

Thanks very much for your reply! I've found the corresponding code that handles the regularization of the network weights.

lukasruff commented 5 years ago

As @rsaite pointed out, PyTorch makes it simple to add weight decay regularization on the network weights via the optimizer. Just to add the specific lines for reference:

Deep SVDD trainer https://github.com/lukasruff/Deep-SVDD-PyTorch/blob/5d7195dfff37efdaccd337257484d9d3db464730/src/optim/deepSVDD_trainer.py#L49

Autoencoder trainer for pretraining https://github.com/lukasruff/Deep-SVDD-PyTorch/blob/5d7195dfff37efdaccd337257484d9d3db464730/src/optim/ae_trainer.py#L30
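For readers without the repository open, a hedged sketch of the quadratic one-class loss that the trainer computes; the weight regularization term is absent on purpose, since it is applied through the optimizer's weight_decay as discussed above. `outputs` and `c` stand in for the network outputs and the hypersphere center:

```python
import torch

def one_class_deep_svdd_loss(outputs: torch.Tensor, c: torch.Tensor) -> torch.Tensor:
    """Mean squared distance of the network outputs to the center c.

    This is only the data term of the objective; the lambda/2 * sum ||W||^2
    term is handled implicitly via the optimizer's weight_decay.
    """
    dist = torch.sum((outputs - c) ** 2, dim=1)  # squared distance per sample
    return torch.mean(dist)

# Example usage with dummy values (illustrative shapes only):
outputs = torch.randn(8, 32)  # batch of 8 learned representations
c = torch.zeros(32)           # hypersphere center
loss = one_class_deep_svdd_loss(outputs, c)
```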