henrryzh1 / UR-DMU

Official code for AAAI2023 paper "Dual Memory Units with Uncertainty Regulation for Weakly Supervised Video Anomaly Detection"
MIT License

About the Evaluation Metrics "AUC_sub and AP_sub" in paper #4

Closed YukiFan closed 1 year ago

YukiFan commented 1 year ago

Sorry to bother you again. I would like to know how to obtain the AUC_sub and AP_sub reported in Table 6 of your paper. Using the checkpoint you released, I get "auc: 0.9401785396804845, ap: 0.8166405634057078" on the full test set. I then saved the predicted scores and ground truth of all abnormal videos in the test set and evaluated them in the same way as the full test set, but the result I got is "auc_ab = 0.5053128881292448, ap_ab = 0.41389405551736613", which differs from the results reported in the paper. Thank you~

henrryzh1 commented 1 year ago

We use the following code to get the AUC_sub and AP_sub.

import numpy as np
from sklearn.metrics import confusion_matrix, precision_recall_curve, roc_curve, auc

print("---------------------XD_violence--------------------")

# Count the total number of abnormal-video frames. The first 2500 lines of
# XD_Test.list are the abnormal test videos (each video appears 5 times, once
# per crop), and each feature snippet covers 16 frames.
with open("../list/XD_Test.list", encoding='utf8') as f:
    count = 0
    lines = f.readlines()[0:2500]
    for i in range(500):
        num = np.load(lines[i * 5].strip()).shape[0]
        count = count + num * 16

predict = np.load("xd_frame_pre.npy")
gt = np.load("xd_gt.npy")
print(f'anomaly frames: {count}')

# AUC and AP over the whole test set.
print("---------------- AUC,AP of XD_violence---------------")
fpr, tpr, thres = roc_curve(gt, predict)
auc_score = auc(fpr, tpr)
precision, recall, th = precision_recall_curve(gt, predict)
ap_score = auc(recall, precision)
print(ap_score)
print(auc_score)

# False alarm rate (FAR) on the normal videos: the frames after `count`
# belong to the normal test videos. Binarize the scores at 0.5.
P = predict[count:]
P[P > 0.5] = 1
P[P <= 0.5] = 0
GT = gt[count:]
conf_mat = confusion_matrix(GT, P, labels=[1, 0])
print("---------------- FAR of XD_violence------------------")
print(conf_mat)
print(conf_mat[1][0] / len(P))

# AUC_sub and AP_sub: evaluate only on the abnormal videos, i.e. the first
# `count` frames.
print("---------------- AUC,AP of XD_violence_sub-----------")
fpr, tpr, thres = roc_curve(gt[:count], predict[:count])
auc_score = auc(fpr, tpr)
print(auc_score)
precision, recall, th = precision_recall_curve(gt[:count], predict[:count])
ap_score = auc(recall, precision)
print(ap_score)

You can check it with this.
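One detail worth noting when comparing numbers: the snippet above computes AP as the trapezoidal area under the precision-recall curve via `auc(recall, precision)`. This is close to, but not identical to, sklearn's `average_precision_score`, which uses a step-wise sum, so small discrepancies between the two are expected. A minimal sketch on synthetic data (the labels and scores below are made up for illustration):

```python
import numpy as np
from sklearn.metrics import precision_recall_curve, auc, average_precision_score

# Toy ground truth and anomaly scores, just to show the two AP variants.
gt = np.array([0, 0, 1, 1, 0, 1])
scores = np.array([0.1, 0.4, 0.35, 0.8, 0.2, 0.7])

precision, recall, _ = precision_recall_curve(gt, scores)
ap_trapz = auc(recall, precision)              # trapezoidal area under the PR curve
ap_step = average_precision_score(gt, scores)  # step-wise interpolated AP

print(ap_trapz, ap_step)
```

Both values are valid summaries of the PR curve; to reproduce the paper's numbers you should use the same variant as the released evaluation code.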

YukiFan commented 1 year ago

Thanks for sharing the code, I have reproduced your "_sub" results (^▽^)