huggingface / transformers

🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
https://huggingface.co/transformers
Apache License 2.0

Add Mask2Former #18503

Open shivalikasingh95 opened 2 years ago

shivalikasingh95 commented 2 years ago

Model description

Mask2Former is a single architecture for panoptic, instance and semantic segmentation.

Mask2Former Paper Abstract: Image segmentation is about grouping pixels with different semantics, e.g., category or instance membership, where each choice of semantics defines a task. While only the semantics of each task differ, current research focuses on designing specialized architectures for each task. We present Masked-attention Mask Transformer (Mask2Former), a new architecture capable of addressing any image segmentation task (panoptic, instance or semantic). Its key components include masked attention, which extracts localized features by constraining cross-attention within predicted mask regions. In addition to reducing the research effort by at least three times, it outperforms the best specialized architectures by a significant margin on four popular datasets. Most notably, Mask2Former sets a new state-of-the-art for panoptic segmentation (57.8 PQ on COCO), instance segmentation (50.1 AP on COCO) and semantic segmentation (57.7 mIoU on ADE20K).
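The masked attention described in the abstract can be sketched roughly as follows. This is a simplified, single-head NumPy illustration with no learned projections (the names `masked_attention`, `queries`, `keys`, `values`, and `mask` are chosen for this sketch, not taken from the paper's code): cross-attention logits are set to negative infinity outside the previous layer's predicted mask region, so the softmax assigns zero weight to background pixels.

```python
import numpy as np

def masked_attention(queries, keys, values, mask):
    """Cross-attention restricted to predicted mask regions (Mask2Former-style sketch).

    queries: (num_queries, d); keys/values: (num_pixels, d);
    mask: (num_queries, num_pixels) boolean, True where the previous
    layer's predicted mask is foreground for that query.
    """
    d = queries.shape[-1]
    logits = queries @ keys.T / np.sqrt(d)  # (num_queries, num_pixels)
    # Additive mask: keep logits inside the predicted region, -inf outside,
    # so softmax gives zero weight to background pixels.
    logits = np.where(mask, logits, -np.inf)
    # If a query's predicted mask is empty, fall back to unmasked attention
    # (the paper handles empty masks by attending everywhere).
    empty = ~mask.any(axis=-1)
    if empty.any():
        logits[empty] = (queries[empty] @ keys.T) / np.sqrt(d)
    # Numerically stable softmax over the pixel axis.
    weights = np.exp(logits - logits.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ values
```

A query whose mask covers only pixels 0 and 1 then produces exactly the same output as ordinary attention over just those two pixels, which is the localization property the abstract refers to.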

Open source status

Provide useful links for the implementation

Paper: https://arxiv.org/abs/2112.01527

Github repo (and weights): https://github.com/facebookresearch/Mask2Former

shivalikasingh95 commented 2 years ago

@NielsRogge I'd like to work on adding this model, if no one is working on it yet.

NielsRogge commented 2 years ago

cc'ing @alaradirik. Yes, we're planning to add this model. If you're interested, feel free to get started with a draft PR. Note that we already have MaskFormer implemented, and I've heard Mask2Former only adds minor modifications.

Could you give me your email address, so that we can add you on Slack for easier communication?

shivalikasingh95 commented 2 years ago

Thanks @NielsRogge that would be great! You can use this email (shivalikasingh95@gmail.com) to add me on Slack!

I'll get started on a draft PR, but I may need some guidance, as this is my first time contributing to transformers. I'll begin by studying the MaskFormer implementation.

shivalikasingh95 commented 2 years ago

Hi @NielsRogge, just a gentle reminder to add me on Slack :)

NielsRogge commented 2 years ago

Hi, I've pinged someone to add you.

ArthurOuaknine commented 2 years ago

Hello. Any updates on the Mask2Former integration? Thanks

shivalikasingh95 commented 2 years ago

Hi @ArthurOuaknine, I'm working on it. There is currently an open PR on my transformers fork; I'll try to wrap it up in the next couple of days.

ArthurOuaknine commented 2 years ago

Thanks for your work, it will definitely help :)