lyn1874 / memAE

Unofficial implementation of the paper "Memorizing Normality to Detect Anomaly: Memory-augmented Deep Autoencoder (MemAE) for Unsupervised Anomaly Detection"

AUC on UCSD Ped2 #4

Open · Markovcom opened this issue 3 years ago

Markovcom commented 3 years ago

I trained the model with your code on the UCSD Ped2 dataset, but I cannot reproduce the reported results. [image]

Markovcom commented 3 years ago

```
------Data folder /data/datasets/ped2/testing/frames
------Model folder log/ped2/lr_0.00020_entropyloss_0.00020_version_0/
------Restored ckpt log/ped2/lr_0.00020_entropyloss_0.00020_version_0/model-0079.pt
data path: True
AutoEncoderCov3DMem
len data: 1830
The length of the reconstruction error is 1830
The length of the testing images is 1830
............start to checking the anomaly detection auc score...................
............use ckpt dir at step 79
Number of gt frames: 1830
Number of predictions: 1830
AUC score on data ped2 is 0.85
```
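(For context on the 0.85 number: it is the frame-level ROC AUC, i.e. the probability that a randomly chosen anomalous frame receives a higher anomaly score than a randomly chosen normal frame. A minimal, illustrative sketch of that metric, not the repo's actual evaluation code:)

```python
def frame_level_auc(scores, labels):
    """ROC AUC via the Mann-Whitney U statistic: the fraction of
    (anomalous, normal) frame pairs where the anomalous frame has
    the higher anomaly score (ties count as half).

    scores: per-frame anomaly scores (e.g. normalized reconstruction errors)
    labels: per-frame ground truth, 1 = anomalous, 0 = normal
    """
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

A perfect ordering of the scores gives 1.0, a reversed ordering gives 0.0, and uninformative scores sit around 0.5.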

lyn1874 commented 3 years ago

Hi, thanks for your interest in this implementation.

I have downloaded the repo and trained the model on the UCSDped2 dataset. My AUC varies between 0.90 and 0.94 (although I am not sure whether that range is reasonable, since the authors didn't report confidence intervals for their accuracies). Also, the last checkpoint, model-0079.pt, is not always the best; you could also try evaluating the checkpoints from other steps.
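The checkpoint sweep suggested above could look roughly like this. The `evaluate` callable, which would restore a checkpoint and return its test AUC, is hypothetical; the repo's actual entry points may differ:

```python
def best_checkpoint(steps, evaluate):
    """Evaluate the model at several checkpoint steps and return the
    step with the highest AUC, rather than trusting the last one.

    steps:    iterable of checkpoint steps, e.g. range(9, 80, 10)
    evaluate: callable mapping a step to its test AUC (user-supplied,
              e.g. restore the step's .pt file and run the eval script)
    """
    results = {step: evaluate(step) for step in steps}
    best = max(results, key=results.get)
    return best, results[best]
```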

chenming1999 commented 3 years ago

How did you get this file: `Avenue_gt.npy`?

Markovcom commented 3 years ago

![Uploading image.png…]()

chenming1999 commented 3 years ago

@Markovcom What did you send? I can't see it.

Markovcom commented 3 years ago

[image] @chenming1999

Markovcom commented 3 years ago

./ckpt/Avenue_gt.npy

chenming1999 commented 3 years ago

I know it is in this folder, but is this file generated by the program? @lyn1874
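(One way to answer this empirically is to inspect the file. The assumed format here, one binary label per test frame, is a guess and is not confirmed in this thread:)

```python
import numpy as np

def inspect_gt(path="./ckpt/Avenue_gt.npy"):
    """Load the ground-truth array and report its shape, dtype, and the
    distinct label values it contains (assumption: 0 = normal frame,
    1 = anomalous frame)."""
    gt = np.load(path)
    return gt.shape, gt.dtype, np.unique(gt)
```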

interstate50 commented 3 years ago

Hi, I trained the model with this repo but ran into some problems. When entropyloss_weight is 0 (lr_0.00010_entropyloss_0.00000_version_0), the model reaches an AUC of 0.92 on UCSDped2. However, when entropyloss_weight is 0.0002 (lr_0.00010_entropyloss_0.00020_version_0), it only reaches an AUC of 0.7938. Could you check why? And following the previous question (https://github.com/lyn1874/memAE/issues/4#issuecomment-742170152), it similarly seems that the learning-rate setting is also important (lr_0.00010_entropyloss_0.00020_version_0, AUC=0.85). @lyn1874

lyn1874 commented 3 years ago

Hey, @interstate50 thanks for your interest. As for your questions:

  1. Yes, I am facing the same issue: when I include the entropy loss term in the loss function, the AUC score sometimes gets worse. I think this is because some of my hyperparameters are not optimal; for example, I didn't tune the number of training epochs, the learning-rate decay schedule, or the optimizer.
  2. Yes, the learning-rate setting is important, not only here but in the training of virtually every neural network.
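For reference, the entropy term under discussion is the sparsity regularizer from the MemAE paper, applied to the memory addressing weights. A minimal sketch follows; the default weight 0.0002 mirrors the flag values in this thread, but the function names are illustrative, not the repo's code:

```python
import math

def addressing_entropy(weights, eps=1e-12):
    """E(w) = -sum_i w_i * log(w_i): low when the addressing weights
    are sparse (one memory slot dominates), high when they are uniform.
    Minimizing it pushes the memory addressing toward sparsity."""
    return -sum(w * math.log(w + eps) for w in weights)

def memae_loss(recon_loss, weights, entropy_weight=0.0002):
    """Total loss = reconstruction error + entropy_weight * entropy term.
    The observation in this thread: a nonzero entropy_weight sometimes
    hurts AUC unless the other hyperparameters are retuned."""
    return recon_loss + entropy_weight * addressing_entropy(weights)
```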