open-mmlab / playground

A central hub for gathering and showcasing amazing projects that extend OpenMMLab with SAM and other exciting features.

Suggestion - Integrate MobileSAM into the pipeline for lightweight and faster inference #128

Open mdimtiazh opened 1 year ago

mdimtiazh commented 1 year ago

Reference: https://github.com/ChaoningZhang/MobileSAM

Our project performs on par with the original SAM and keeps exactly the same pipeline as the original SAM, except for a change to the image encoder; therefore, it is easy to integrate into any project.
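Because MobileSAM reuses the segment-anything interface, swapping it into an existing SAM pipeline is essentially a drop-in change. A minimal sketch, assuming the `mobile_sam` package is installed and a `mobile_sam.pt` checkpoint has been downloaded (the checkpoint path, image path, and point prompt below are placeholders):

```python
import cv2
import numpy as np
import torch
from mobile_sam import sam_model_registry, SamPredictor

device = "cuda" if torch.cuda.is_available() else "cpu"

# MobileSAM replaces the heavy ViT-H image encoder with a tiny ViT ("vit_t"),
# but the predictor API matches the original segment-anything package.
mobile_sam = sam_model_registry["vit_t"](checkpoint="./weights/mobile_sam.pt")
mobile_sam.to(device=device)
mobile_sam.eval()

predictor = SamPredictor(mobile_sam)

image = cv2.cvtColor(cv2.imread("demo.jpg"), cv2.COLOR_BGR2RGB)
predictor.set_image(image)

# Prompt with a single foreground point; box and mask prompts work the same way.
masks, scores, _ = predictor.predict(
    point_coords=np.array([[320, 240]]),
    point_labels=np.array([1]),
    multimask_output=True,
)
```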

MobileSAM is around 60 times smaller and around 50 times faster than the original SAM, and it is around 7 times smaller and around 5 times faster than the concurrent FastSAM. The comparison of the whole pipeline is summarized as follows:

[Images: whole-pipeline comparison of model size and inference speed for SAM, FastSAM, and MobileSAM; not preserved in this copy.]

Best Wishes,

Qiao

YanxingLiu commented 1 year ago

We have added support for mobile_sam and opened a PR to merge it into the main branch. Note that the new version of label_anything needs an additional parameter, model_name. Detailed usage is described in readme_zh_CN.md. Refer to: https://github.com/open-mmlab/playground/pull/132. Thank you for your interest.
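For illustration only, a model_name switch could look roughly like the sketch below. This is a hypothetical example of the idea, not the playground's actual implementation; the real parameter handling lives in label_anything (see the PR above and readme_zh_CN.md).

```python
# Hypothetical sketch: selecting SAM vs. MobileSAM via a model_name parameter.
# Not the playground's actual code; see the linked PR and readme_zh_CN.md.
def build_sam_predictor(model_name: str, checkpoint: str, device: str = "cuda"):
    if model_name == "mobile_sam":
        # MobileSAM ships its own copy of the segment-anything API.
        from mobile_sam import sam_model_registry, SamPredictor
        model_type = "vit_t"
    else:
        from segment_anything import sam_model_registry, SamPredictor
        model_type = model_name  # e.g. "vit_b", "vit_l", "vit_h"

    sam = sam_model_registry[model_type](checkpoint=checkpoint)
    sam.to(device=device).eval()
    return SamPredictor(sam)

# Example: predictor = build_sam_predictor("mobile_sam", "./weights/mobile_sam.pt")
```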