Closed nengelmann closed 10 months ago
Definitely agree with this @nengelmann! I'm taking the first step towards this with the new streaming interface, which will return detections consumable by supervision (hopefully getting released today), and I'll follow up shortly by updating the detections returned by our other model APIs.
Resolved by https://github.com/roboflow/inference/pull/147. Model infer() methods now return InferenceModelResponse objects, similar in structure to the JSON returned by the Roboflow inference API.
Description
The API (HTTP) inference and the library (pip install) inference models return results with different structures. It would be neat if both returned results structured like the API inference response. That way the two approaches are interchangeable and, more importantly, the library inference results can be used with supervision just like the API inference results.
Use case
Let's take the quickstart example as the use case. If you run the examples (ready-to-run scripts further down below), the following results are returned.
The API inference result:
The library inference result:
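(The actual outputs of the two scripts are not reproduced here. As a rough illustration only, the hosted API's detection response follows a JSON shape like the sketch below, with field names taken from the public Roboflow API format; all values are made up.)

```python
# Illustrative sketch of the hosted API's JSON response shape.
# Field names follow the documented Roboflow detection format;
# the values are invented, not the output of the scripts above.
api_result = {
    "predictions": [
        {
            "x": 320.5,         # box center x, in pixels
            "y": 240.0,         # box center y, in pixels
            "width": 100.0,     # box width, in pixels
            "height": 80.0,     # box height, in pixels
            "confidence": 0.92,
            "class": "car",
        }
    ],
    "image": {"width": 640, "height": 480},
}
```

The library models, by contrast, returned their own in-memory structures, which is the mismatch this issue is about.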
The API result can be easily processed with supervision, while the library result requires different logic. For a bounding box that logic is fairly simple, but for polygons (instance segmentation) it isn't necessarily. It would also be a lot more convenient to have the library results integrated with supervision.
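The "fairly simple" bounding-box logic can be sketched as follows: the API predictions use center-based coordinates, while supervision's Detections expects corner-based (x_min, y_min, x_max, y_max) boxes. This is a minimal sketch assuming the documented center-format fields; the sample prediction is made up.

```python
def to_xyxy(pred):
    """Convert one center-format prediction dict to corner coordinates."""
    half_w = pred["width"] / 2
    half_h = pred["height"] / 2
    return (pred["x"] - half_w, pred["y"] - half_h,
            pred["x"] + half_w, pred["y"] + half_h)

# Hypothetical prediction, for illustration only.
pred = {"x": 320.0, "y": 240.0, "width": 100.0, "height": 80.0}
print(to_xyxy(pred))  # → (270.0, 200.0, 370.0, 280.0)
```

Stacking these tuples into an array is then enough to build a supervision Detections object; the polygon case has no equally obvious one-liner, which is the pain point above.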
Please let me know if the library results can already be converted into a supervision-compatible format; I just couldn't find anything. From the documentation it was not clear to me how to reformat the instance segmentation results for use with supervision.
Thanks for open-sourcing 'inference'! If I can help with the implementation, I'm happy to, but I might need some guidance/feedback.
Additional
Examples ready to run
API Example
Library Example
Are you willing to submit a PR?