marco-rudolph / differnet

This is the official repository to the WACV 2021 paper "Same Same But DifferNet: Semi-Supervised Defect Detection with Normalizing Flows" by Marco Rudolph, Bastian Wandt and Bodo Rosenhahn.

Why is the train loss negative? #14

Closed nxm19830306 closed 3 years ago

nxm19830306 commented 3 years ago

I used the default configuration with the MVTec AD data, but the train loss I got is negative. Thanks for your answer!

Train epoch 0
THCudaCheck FAIL file=/pytorch/aten/src/THC/THCGeneral.cpp line=405 error=11 : invalid argument
Epoch: 0.0 train loss: -0.0312
Epoch: 0.1 train loss: -0.9794
Epoch: 0.2 train loss: -1.1155
Epoch: 0.3 train loss: -1.2642
Epoch: 0.4 train loss: -1.3475
Epoch: 0.5 train loss: -1.4397
Epoch: 0.6 train loss: -1.5100
Epoch: 0.7 train loss: -1.5529

Compute loss and scores on test set:
Epoch: 0 test_loss: -1.5800 AUROC: last: 0.9106 max: 0.9106 epoch_max: 0

marco-rudolph commented 3 years ago

This is absolutely normal. The loss is the negative log-likelihood of the data under the model; since a likelihood is a density value that can exceed 1, its negative log can be quite negative.
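As a minimal sketch of why this happens: a normalizing-flow loss of this kind combines the squared norm of the latent vector with the log-determinant of the flow's Jacobian, and the second term can dominate. The function name `nll_loss` and the tensor shapes below are illustrative assumptions, not the repository's exact code.

```python
import torch

def nll_loss(z, log_jac_det):
    """Negative log-likelihood under a standard normal prior in latent space.

    z:            latent vectors, shape (batch, dim)
    log_jac_det:  per-sample log |det J| of the flow, shape (batch,)

    The loss is 0.5 * ||z||^2 - log|det J|, averaged over the batch (up to
    constants). When the flow maps inputs into a high-density region,
    log|det J| outgrows 0.5 * ||z||^2 and the loss becomes negative.
    """
    return torch.mean(0.5 * torch.sum(z ** 2, dim=1) - log_jac_det) / z.shape[1]

# Toy example: small latent norm, large positive log-determinant -> negative loss
z = torch.zeros(4, 8)                 # latents near the mode of N(0, I)
log_jac_det = torch.full((4,), 10.0)  # flow expands density strongly
print(nll_loss(z, log_jac_det))       # prints a negative value (-1.25)
```

So a steadily decreasing (and increasingly negative) train loss, as in the log above, simply means the model assigns higher and higher likelihood to the training data.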

https://github.com/marco-rudolph/differnet/issues/10#issuecomment-738085401