rafaelpadilla / review_object_detection_metrics

Object Detection Metrics. 14 object detection metrics, including mean Average Precision (mAP), Average Recall (AR), and Spatio-Temporal Tube Average Precision (STT-AP). This project supports different bounding box formats, as in COCO, PASCAL VOC, ImageNet, etc.

Packaging as pip modules #7

Closed: SJBertram closed this issue 3 years ago

SJBertram commented 3 years ago

As per rafaelpadilla/Object-Detection-Metrics#58, it would be great if this could be published as a pip module on PyPI so that people have a consistent, reliable, trustworthy way to generate metrics for their object detection results. It looks like it is already mostly compatible, as there is a skeleton setup.py, so hopefully it isn't too difficult. I don't know what restructuring may be required to avoid naming collisions with other packages.

While the repo is now called "review_object_detection_metrics", I think import object_detection_metrics is still a sensible name for the package.
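For reference, a minimal sketch of what the packaging metadata could look like, assuming the code is arranged under a src/object_detection_metrics/ package directory. The distribution name, version, and layout below are illustrative assumptions, not decisions already made for this repo:

```python
# setup.py -- minimal illustrative sketch; the real metadata (version,
# dependencies, package layout) would have to match this repository.
from setuptools import setup, find_packages

setup(
    name="object_detection_metrics",  # assumed distribution name
    version="0.1.0",                  # pre-1.0: API may still change
    description="Object detection metrics (mAP, AR, ...)",
    url="https://github.com/rafaelpadilla/review_object_detection_metrics",
    packages=find_packages(where="src"),
    package_dir={"": "src"},
    python_requires=">=3.6",
)
```

With that in place, pip install . (or a PyPI upload) would expose the package as import object_detection_metrics, assuming the package directory is named accordingly.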

I don't know whether @andreydung or @malpunek would still be interested in helping with this new version.

rafaelpadilla commented 3 years ago

@SJBertram that's a good idea.

Some points we need to be aware of:

What are your suggestions?

Regarding the other repository, I will answer in the issue https://github.com/rafaelpadilla/Object-Detection-Metrics/issues/58

potipot commented 3 years ago

This code is great, much cleaner and more understandable than the original COCOMetric. I like the simple plug-in API of

def get_coco_summary(groundtruth_bbs, detected_bbs):
    [...]

The IceVision library could definitely use it. Did you run any efficiency comparison against the original metric? How much faster or slower is this library?
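For context, here is a hedged sketch of how I imagine consuming that API; the import path, the placeholder lists, and the returned dict are assumptions on my part, not verified against this repo:

```python
# Illustrative sketch only: the module path and return type are assumed.
from src.evaluators.coco_evaluator import get_coco_summary

groundtruth_bbs = [...]  # placeholder: list of ground-truth BoundingBox objects
detected_bbs = [...]     # placeholder: list of detections with confidences

# Assumed: the summary is a dict mapping metric names (AP, AP50, ...) to floats.
summary = get_coco_summary(groundtruth_bbs, detected_bbs)
for name, value in summary.items():
    print(f"{name}: {value}")
```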

rafaelpadilla commented 3 years ago

@potipot, please keep this issue focused on its original purpose. I kindly ask you to open a new issue with your comments.

pwoolvett commented 3 years ago

@rafaelpadilla I'd be glad to help you with the packaging process.

With respect to the version/release on PyPI, the common consensus is to use semver and keep the major version below 1, e.g. 0.x.y. This way, we have the best of both worlds.
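As a concrete example of what that buys consumers (the package name below is assumed), a pre-1.0 compatible-release pin would look like this:

```python
# In a downstream project's setup.py: under 0.x.y semver, breaking changes
# bump the minor number, so consumers pin to one minor series at a time.
install_requires = [
    "object_detection_metrics~=0.1.0",  # PEP 440: >=0.1.0, <0.2.0
]
```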

I can help set up testing/release/publishing for the project, but we need to discuss the details first...

SJBertram commented 3 years ago

I've never released on PyPI, but my understanding is that already having a setup.py gets you a step closer than the previous project.

With regard to the license, you've already got an MIT-like license, so people are free to use it as long as they accept the terms. However, as it's modified, I don't think it's technically Open Source compliant or compatible with other licenses. A plain MIT license would allow anyone to use it without legal departments having to check and approve the license terms.

Looking at your additional clauses (I am not a lawyer, but) I think:

I like the idea of keeping versions below 1.0 to start with so that people know it is potentially unstable and prone to API change, while still allowing them to try it out if they want to.

rafaelpadilla commented 3 years ago

Hi @SJBertram ,

Thank you for your clear response :)

Regarding point 2: I think this is not a problem, since I do not intend to move this repository to any other account. And it does not matter if the tool points to a non-existent repository; the user will still be able to use the tool even if the repository name is changed, right?

On the next point, you mentioned that "the license terms of MIT won't force them to credit you in their paper (because the copyright only extends to the source code, not its outputs)". That is clear. I believe that in the academic world there is no legal way to "force" someone to reference your work. But I believe that researchers using this tool have no reason not to cite it in their publications. To reinforce this, I added a splash screen that pops up when the user opens the tool, asking users to reference it. The reason for the citations is that this tool is part of a project that received financial aid, and one way the funding institution evaluates the success of that aid is by tracking the number of citations.
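Just to illustrate the idea (this is not the tool's actual code, and the PyQt5 usage is only an assumed approximation of the GUI), such a prompt could be as simple as:

```python
# Hypothetical sketch of a citation-request dialog; the real splash screen
# is implemented in this repo's GUI code, not here.
import sys
from PyQt5.QtWidgets import QApplication, QMessageBox

app = QApplication(sys.argv)
QMessageBox.information(
    None,
    "Please cite this tool",
    "If this tool helps your research, please reference it in your publications.",
)
```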

To avoid misinterpretations of the license, I removed the LaTeX format and changed it a little.

@pwoolvett, I opened the Discussions tab; we could continue this conversation there. As you are more experienced in creating pip packages, please let me know what the required steps are.