PINTO0309 / Keras-OneClassAnomalyDetection

[5 FPS - 150 FPS] Learning Deep Features for One-Class Classification (AnomalyDetection). Corresponds RaspberryPi3. Convert to Tensorflow, ONNX, Caffe, PyTorch. Implementation by Python + OpenVINO/Tensorflow Lite.
https://qiita.com/shinmura0
MIT License

Questions concerning the compactness loss #5

Closed Sylv-Lej closed 3 years ago

Sylv-Lej commented 3 years ago

Hi,

I've been studying DOC for a while now and I found something weird:

Is original_loss a custom loss, or is it an implementation of the paper's loss?

As written in the original paper:

K.sum((y_pred - K.mean(y_pred, axis=0))**2, axis=[1])

must be the average sample variance,

but

K.var(y_pred, axis=1)

is quite different, causing a very different loss than expected.

Am I missing something?
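
For reference, a minimal NumPy sketch (not code from the repository) contrasting what the two expressions compute on a batch of embeddings:

```python
import numpy as np

# y_pred: one batch of embedding vectors, shape (batchsize, features)
rng = np.random.default_rng(0)
y_pred = rng.normal(size=(8, 4))

# Paper/notebook expression: per-sample squared distance to the batch mean
# (the deviation is taken across the batch, axis=0)
batch_mean = y_pred.mean(axis=0)                               # shape (features,)
dist_to_batch_mean = ((y_pred - batch_mean) ** 2).sum(axis=1)  # shape (batchsize,)

# K.var(y_pred, axis=1) analogue: variance across the feature axis of each
# individual sample -- a different quantity altogether
within_sample_var = y_pred.var(axis=1)                         # shape (batchsize,)

print(dist_to_batch_mean.mean())  # mean squared distance to the batch centroid
print(within_sample_var.mean())   # mean within-sample feature variance
```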

shinmura0 commented 3 years ago

As written in the original paper: K.sum((y_pred - K.mean(y_pred, axis=0))**2, axis=[1])

In the implementation (from_preprocessing_to_training.ipynb), the code is the same as above, and it is based on the paper.

[image: the compactness loss equation from the paper]

K.var(y_pred, axis=1)

Where is this? I cannot find this code in the implementation.
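
For context, here is that loss written out as a Keras loss function. This is a sketch reconstructed from the line quoted in the next comment, not copied verbatim from the notebook; classes and batchsize are assumed to be module-level variables holding the embedding dimension and the batch size:

```python
from keras import backend as K

# Assumed hyperparameters: embedding dimension and training batch size
classes = 4
batchsize = 8

def original_loss(y_true, y_pred):
    # Per-sample squared distance to the batch mean, rescaled by
    # batchsize**2 / (batchsize - 1)**2 (see the follow-up discussion)
    lc = (1.0 / (classes * batchsize)) * batchsize**2 \
         * K.sum((y_pred - K.mean(y_pred, axis=0))**2, axis=[1]) \
         / ((batchsize - 1)**2)
    return lc
```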

Sylv-Lej commented 3 years ago

I understand this

You wrote this: lc = 1/(classes*batchsize) * batchsize**2 * K.sum((y_pred - K.mean(y_pred, axis=0))**2, axis=[1]) / ((batchsize-1)**2)

I understand the part that is written in the paper: 1/(classes*batchsize) * K.sum((y_pred - K.mean(y_pred, axis=0))**2, axis=[1])

but not the multiplication by batchsize**2 / ((batchsize-1)**2),

and why is (y_pred - K.mean(y_pred, axis=0)) squared?

Sylv-Lej commented 3 years ago

K.var(y_pred, axis=1)

has never been in your code; I wrote it because of

[screenshot: 2021-02-17 at 11:26:57]
shinmura0 commented 3 years ago

I remembered how I implemented it. My code was based on this equation (from the appendix): [image: equation from the paper's appendix]

And your proposal

K.var(y_pred, axis=1)

is also possible.
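
In other words, the extra factor appears to come from the leave-one-out mean in the paper's definition: z_i = x_i - m_i, where m_i is the mean of the other batchsize - 1 samples in the batch, and x_i - m_i = (batchsize / (batchsize - 1)) * (x_i - batch mean). Squaring that deviation (the z_i^T z_i term, which is also why the expression is squared) produces the batchsize**2 / ((batchsize-1)**2) factor. A minimal NumPy check of the equivalence (a sketch, not code from the repository):

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 8, 4                        # batchsize, embedding dimension
x = rng.normal(size=(n, k))        # one batch of embeddings

# Notebook form: squared distance to the full-batch mean, rescaled by n**2 / (n-1)**2
lc_notebook = (1.0 / (k * n)) * n**2 \
              * ((x - x.mean(axis=0))**2).sum(axis=1) / (n - 1)**2

# Paper form: squared distance of each sample to the mean of the OTHER n-1 samples
m_others = (x.sum(axis=0, keepdims=True) - x) / (n - 1)  # leave-one-out means
lc_paper = (1.0 / (k * n)) * ((x - m_others)**2).sum(axis=1)

print(np.allclose(lc_notebook, lc_paper))  # True: the two forms are identical
```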