Closed Warcry25 closed 3 months ago
Is it possible to obtain the results after training has completed?
There may be too many objects in your dataset. You can save the results and recalculate the metrics later in the same way:
https://nbviewer.org/github/MiXaiLL76/faster_coco_eval/blob/main/examples/eval_example.ipynb
https://nbviewer.org/github/MiXaiLL76/faster_coco_eval/blob/main/examples/curve_example.ipynb
I finally figured it out, thank you anyway. I just added this to my config.py:
```python
vis_backends = [
    dict(type='LocalVisBackend'),
    dict(type='WandbVisBackend'),
    dict(type='TensorboardVisBackend'),
]
visualizer = dict(
    name='visualizer',
    type='DetLocalVisualizer',
    vis_backends=vis_backends)
val_evaluator = dict(
    type='CocoMetric',
    metric='bbox',
    classwise=True,
    format_only=False,
    backend_args=None,
    ann_file='data/coco/valid/_annotations.coco.json')
```
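If you also evaluate with `tools/test.py`, MMDetection 3.x configs conventionally need a `test_evaluator` as well; a common pattern (assuming the test set uses the same annotation format) is to reuse the validation settings. This is a config fragment, not standalone code:

```python
# Reuse the validation evaluator for testing; override ann_file if your
# test split has its own annotation file (path below is a placeholder).
test_evaluator = val_evaluator
```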
I want these evaluation results for my custom-trained model, but I'm unsure how to make them appear. I'm using CocoMetric. Can someone help me, please?
```
Average Precision  (AP) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.055
Average Precision  (AP) @[ IoU=0.50      | area=   all | maxDets=100 ] = 0.112
Average Precision  (AP) @[ IoU=0.75      | area=   all | maxDets=100 ] = 0.046
Average Precision  (AP) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.000
Average Precision  (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.071
Average Precision  (AP) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.226
Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=  1 ] = 0.077
Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets= 10 ] = 0.271
Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.319
Average Recall     (AR) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.250
Average Recall     (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.264
Average Recall     (AR) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.397
```