CVHub520 / X-AnyLabeling

Effortless data labeling with AI support from Segment Anything and other awesome models.
GNU General Public License v3.0
3.04k stars 345 forks

How to include confidence information in the annotation results. #399

Closed dejun219 closed 2 weeks ago

dejun219 commented 1 month ago

Dear author,

In the annotation results of this model, each bounding box does not include the corresponding predicted confidence information. Could you guide me on how to modify the relevant script so that the output results in the JSON file include confidence information?

Best regards,

CVHub520 commented 1 month ago

Hey there!

Right now, there isn't a built-in way to add output fields to label files, so you'll need to tweak the code a bit to make that happen. I suggest taking a closer look at the source code to figure out where you need to make those adjustments. Just make sure to give it a good test run after you've made your changes.

If you run into any snags, feel free to reach out.

Take care!

dejun219 commented 1 month ago

Hello author. After looking into this more closely, I found that you may also be Chinese, so let me ask you in more detail: [screenshot: 20240509-094224] I want to add each bounding box's confidence information to the annotation result JSON file, to help me decide whether to use the data and whether the annotation is correct. You mentioned modifying the source code yesterday; I tried but did not succeed. Could you explain in more detail which part of the source code needs to be modified? Thanks for your reply.

CVHub520 commented 1 month ago

Hello, modifying the source code involves changes in a few places:

  1. https://github.com/CVHub520/X-AnyLabeling/blob/main/anylabeling/views/labeling/shape.py
  2. predict_shapes() in your model's inference file
  3. https://github.com/CVHub520/X-AnyLabeling/blob/main/anylabeling/views/labeling/label_file.py

More specifically, I previously added a difficult parameter; you can search the commit history for it to see which places are involved when adding a new field.
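The three steps above can be sketched as follows. This is a minimal, hypothetical illustration of the idea (a `score` attribute threaded from inference through serialization), not the actual X-AnyLabeling classes: the real attribute names and file format live in `shape.py` and `label_file.py`, and the real inference entry point is each model's `predict_shapes()`.

```python
import json

class Shape:
    """Minimal stand-in for X-AnyLabeling's Shape class (hypothetical
    attributes; check the real shape.py for the actual fields)."""

    def __init__(self, label, points, score=None):
        self.label = label
        self.points = points
        # Predicted confidence; None when the box was drawn by hand.
        self.score = score

    def to_dict(self):
        d = {"label": self.label, "points": self.points}
        # Only emit the field when a model produced this shape.
        if self.score is not None:
            d["score"] = round(float(self.score), 4)
        return d

def save_labels(path, shapes):
    """Mirrors the label_file.py save step: serialize shapes to JSON."""
    data = {"shapes": [s.to_dict() for s in shapes]}
    with open(path, "w", encoding="utf-8") as f:
        json.dump(data, f, ensure_ascii=False, indent=2)

# In predict_shapes(), the model's per-box confidence would be passed in:
shapes = [Shape("car", [[10, 10], [50, 40]], score=0.87)]
```

Keeping `score` optional means hand-drawn annotations serialize exactly as before, so existing label files stay readable.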

CVHub520 commented 1 month ago

Hi @dejun219:

Just wanted to let you know that the feature request for displaying the confidence score has been implemented and pushed to the remote repository.

You can now update your local source code to try out this new addition. Happy coding!

Best regards, CVHub

dejun219 commented 1 month ago

Thank you, I'll update on my end.