matejgrcic / DenseHybrid

Official implementation of paper "DenseHybrid: Hybrid Anomaly Detection for Dense Open-set Recognition"
GNU General Public License v2.0

The checkpoint for SMIYC test #5

Closed yyliu01 closed 1 year ago

yyliu01 commented 1 year ago

Hi Matej,

Could I ask for the checkpoint in SMIYC dataset?

Cheers, Yuyuan

matejgrcic commented 1 year ago

Hi,

I've just updated the README page. You can find the weights there.

Matej

yyliu01 commented 1 year ago

Hi Matej,

Thanks for your checkpoint! @matejgrcic

The evaluation results based on your checkpoint are below:

SMIYC-RoadAnomalyTrack:
- AUROC for segment_me_anomaly: 0.8927
- AUPRC for segment_me_anomaly: 0.6362
- FPR@TPR95 for segment_me_anomaly: 0.5071

SMIYC-ObstacleTrack:
- AUROC for segment_me_obstacle: 0.9984
- AUPRC for segment_me_obstacle: 0.8769
- FPR@TPR95 for segment_me_obstacle: 0.0058

The results seem far from the test results reported in your paper.

Please note: I've utilised the measurement for anomaly objects based on the code from here.

Is there anything I'm missing?

Cheers, Yuyuan

matejgrcic commented 1 year ago

That's strange. Did you evaluate using the official code (https://github.com/SegmentMeIfYouCan/road-anomaly-benchmark)? I've uploaded my script for submission if it helps.

I get:
- AP AnomalyTrack val: 86.5%
- AP ObstacleTrack val: 94.8%

yyliu01 commented 1 year ago

Hi Matej,

I've found the difference and figured it out. Thanks for your time.

Cheers, Yuyuan

yyliu01 commented 1 year ago

For anyone who may have this concern, the difference is in the mean and std used for image normalization, as shown below:

C.image_mean = numpy.array([0, 0, 0])
C.image_std = numpy.array([1, 1, 1])
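To illustrate why this matters: the config above performs identity normalization (zero mean, unit std), so images are fed to the network unchanged. If an evaluation pipeline instead applies different per-channel statistics (for example, the common ImageNet mean/std), every pixel value the model sees is shifted and scaled, which can noticeably degrade the anomaly scores. The sketch below is a minimal illustration of that effect; the `normalize` helper and the ImageNet statistics are standard but not part of this repository's code.

```python
import numpy as np

def normalize(image, mean, std):
    """Channel-wise normalization: (image - mean) / std."""
    return (image - mean) / std

# Identity statistics, matching the config in this thread:
identity_mean = np.array([0.0, 0.0, 0.0])
identity_std = np.array([1.0, 1.0, 1.0])

# Common ImageNet statistics (for inputs scaled to [0, 1]) that a
# different evaluation script might apply instead:
imagenet_mean = np.array([0.485, 0.456, 0.406])
imagenet_std = np.array([0.229, 0.224, 0.225])

image = np.random.rand(4, 4, 3)  # toy HWC image in [0, 1]

same = normalize(image, identity_mean, identity_std)
shifted = normalize(image, imagenet_mean, imagenet_std)

assert np.allclose(same, image)         # identity leaves the input unchanged
assert not np.allclose(shifted, image)  # mismatched stats change the input
```

If the checkpoint was trained with one set of statistics and evaluated with another, the two assertions above capture exactly the mismatch that caused the score gap in this thread.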