Hi Matej,
Could I ask for the checkpoint for the SMIYC dataset?
Cheers, Yuyuan
Hi,
I've just updated the README page. You can find the weights there.
Matej
Hi Matej,
Thanks for your checkpoint! @matejgrcic
The evaluation results based on your checkpoint are below:
SMIYC-RoadAnomalyTrack:
- AUROC score for segment_me_anomaly: 0.8927
- AUPRC score for segment_me_anomaly: 0.6362
- FPR@TPR95 for segment_me_anomaly: 0.5071

SMIYC-ObstacleTrack:
- AUROC score for segment_me_obstacle: 0.9984
- AUPRC score for segment_me_obstacle: 0.8769
- FPR@TPR95 for segment_me_obstacle: 0.0058
These results seem far from the test results reported in your paper.
Please note: I computed the metrics for anomalous objects based on the code from here.
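For reference, the computation boils down to roughly the following (a minimal sklearn sketch; `scores` and `labels` are placeholder names, not variables from the linked code):

```python
import numpy as np
from sklearn.metrics import average_precision_score, roc_auc_score, roc_curve

def anomaly_metrics(scores: np.ndarray, labels: np.ndarray):
    """scores: per-pixel anomaly scores; labels: binary masks (1 = anomaly), both flattened."""
    auroc = roc_auc_score(labels, scores)
    auprc = average_precision_score(labels, scores)
    # FPR at the first threshold where TPR reaches 95%
    fpr, tpr, _ = roc_curve(labels, scores)
    fpr_at_tpr95 = fpr[np.searchsorted(tpr, 0.95)]
    return auroc, auprc, fpr_at_tpr95
```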
Is there anything I'm missing?
Cheers, Yuyuan
That's strange. Did you evaluate using the official code (https://github.com/SegmentMeIfYouCan/road-anomaly-benchmark)? I've uploaded my submission script in case it helps.
I get:
- AP AnomalyTrack val: 86.5%
- AP ObstacleTrack val: 94.8%
Hi Matej,
I've found the difference and figured it out. Thanks for your time.
Cheers, Yuyuan
For anyone who has the same concern: the difference is in the mean and std used to normalise the input images, as shown below:
```python
import numpy

# Zero mean and unit std, i.e. no normalisation is applied to the inputs.
C.image_mean = numpy.array([0, 0, 0])
C.image_std = numpy.array([1, 1, 1])
```
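In other words, the inputs are left un-normalised; if your pipeline applies e.g. ImageNet statistics, the scores come out differently. A purely illustrative torchvision sketch of the two variants (the repo's actual loader may scale inputs differently):

```python
import torchvision.transforms as T

# Common ImageNet normalisation (what I had been applying):
imagenet_preproc = T.Compose([
    T.ToTensor(),  # scales pixels to [0, 1]
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

# Identity normalisation matching the config above (a no-op):
identity_preproc = T.Compose([
    T.ToTensor(),
    T.Normalize(mean=[0.0, 0.0, 0.0], std=[1.0, 1.0, 1.0]),
])
```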