marco-rudolph / differnet

This is the official repository to the WACV 2021 paper "Same Same But DifferNet: Semi-Supervised Defect Detection with Normalizing Flows" by Marco Rudolph, Bastian Wandt and Bodo Rosenhahn.

the values of train loss are becoming negative #10

Closed QingL0218 closed 3 years ago

QingL0218 commented 3 years ago

Hi, dear Marco. I have a question: when I train with this code, the train loss goes from positive to negative. Is this correct? I'm confused. Waiting for your reply.

Best Regards!

marco-rudolph commented 3 years ago

This is absolutely normal. The loss is the negative log likelihood, which can be quite negative, since the density of a continuous distribution can exceed 1.

QingL0218 commented 3 years ago

OK, thank you very much!