Koldim2001 / YOLO-Patch-Based-Inference

Python library for YOLO small object detection and instance segmentation
GNU Affero General Public License v3.0

Is there a way to support the yolo-obb model? #24

Open filot977 opened 2 months ago

filot977 commented 2 months ago

Dear authors.

Do you have any plans to support the yolo-obb model?

Thank you.

Koldim2001 commented 2 months ago

Good afternoon. Development in this area is underway. A separate library from our team called patched_obb_infer will most likely be released within the next month, so stay tuned for updates. I will let you know when the first release is available; the current estimate is early November.

Alisoltan82 commented 3 weeks ago

Is there a way to validate model output while using patch inference, since it usually boosts overall performance?

Previously I was able to do so with a tool like SAHI after converting the annotations to COCO format.

Koldim2001 commented 3 weeks ago

@Alisoltan82 Unfortunately, there isn't a ready-made tool to directly export patch inference results to COCO format. However, you can still obtain the results of the patch inference algorithm in a structured way, which you can then convert to any desired format, including COCO. Here's how you can access the results:

# Final results (attributes of the combined patch-inference result object):
img = result.image                              # source image
confidences = result.filtered_confidences       # confidence score for each detection
boxes = result.filtered_boxes                   # bounding box coordinates for each detection
polygons = result.filtered_polygons             # segmentation polygons (instance segmentation models only)
classes_ids = result.filtered_classes_id        # numeric class ids
classes_names = result.filtered_classes_names   # human-readable class names

With these results, you can write the data into any format you need, including COCO. While we don't currently have a built-in tool for directly saving to COCO, we might consider adding such a feature in the future. For now, you can manually convert the results to COCO format using the data provided.
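As a rough illustration, the sketch below converts those attributes into COCO-style detection entries that can be scored with standard COCO tooling. It assumes result is the combined patch-inference result object shown above, that filtered_boxes holds [x_min, y_min, x_max, y_max] pixel coordinates, and that the image_id, category mapping, and output path are placeholders to adapt to your own dataset:

import json

def results_to_coco(result, image_id=1, out_path="predictions_coco.json"):
    """Convert combined patch-inference results to COCO-style detection entries (sketch)."""
    coco_predictions = []
    for box, score, class_id in zip(result.filtered_boxes,
                                    result.filtered_confidences,
                                    result.filtered_classes_id):
        x_min, y_min, x_max, y_max = map(float, box)
        coco_predictions.append({
            "image_id": image_id,          # id of this image in your COCO ground-truth file
            "category_id": int(class_id),  # remap if your COCO categories use different ids
            "bbox": [x_min, y_min, x_max - x_min, y_max - y_min],  # COCO expects [x, y, width, height]
            "score": float(score),
        })
    with open(out_path, "w") as f:
        json.dump(coco_predictions, f)
    return coco_predictions

The resulting JSON can then be evaluated against your ground-truth annotations with pycocotools' COCOeval or any other COCO-compatible evaluator, similar in spirit to how SAHI predictions are typically validated.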

Alisoltan82 commented 3 weeks ago

Thanks