huggingface / transformers

🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
https://huggingface.co/transformers
Apache License 2.0

Add InternImage #22240

Weiyun1025 opened 1 year ago

Weiyun1025 commented 1 year ago

Model description

InternImage is a new large-scale CNN-based foundation model that, like ViTs, benefits from increasing parameters and training data. Unlike recent CNNs that focus on large dense kernels, InternImage takes deformable convolution as its core operator, so the model not only has the large effective receptive field required for downstream tasks such as detection and segmentation, but also performs adaptive spatial aggregation conditioned on the input and task information. InternImage-H achieved a new record of 65.4 mAP on COCO test-dev and 62.9 mIoU on ADE20K, outperforming current leading CNNs and ViTs.

It is worth noting that InternImage relies on a custom CUDA operator, so if this causes problems for the model addition, the CUDA operator can be replaced with a PyTorch implementation.
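If the custom CUDA kernel is indeed a blocker, one possible fallback is a pure-PyTorch deformable convolution. The sketch below uses `torchvision.ops.deform_conv2d` (a DCNv2-style op, not InternImage's actual DCNv3 operator) purely to illustrate the drop-in pattern; the class name and hyperparameters are illustrative, not taken from the InternImage codebase.

```python
import math

import torch
from torch import nn
from torchvision.ops import deform_conv2d


class DeformableConv2d(nn.Module):
    """Deformable 2D convolution without a compiled CUDA extension.

    NOTE: a simplified sketch of the fallback idea (DCNv2-style via
    torchvision), not InternImage's actual DCNv3 operator.
    """

    def __init__(self, in_channels, out_channels, kernel_size=3, stride=1, padding=1):
        super().__init__()
        self.stride = stride
        self.padding = padding
        # A regular conv branch predicts the sampling offsets
        # (2 values, x and y, per kernel location).
        self.offset_conv = nn.Conv2d(
            in_channels,
            2 * kernel_size * kernel_size,
            kernel_size=kernel_size,
            stride=stride,
            padding=padding,
        )
        self.weight = nn.Parameter(
            torch.empty(out_channels, in_channels, kernel_size, kernel_size)
        )
        nn.init.kaiming_uniform_(self.weight, a=math.sqrt(5))

    def forward(self, x):
        offset = self.offset_conv(x)
        # Pure PyTorch/torchvision path; no custom CUDA kernel required.
        return deform_conv2d(
            x, offset, self.weight, stride=self.stride, padding=self.padding
        )
```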

In fact, we have already submitted a version of the code to Transformers; however, for security reasons, the code we submitted cannot call your web inference API, so we would like you to add InternImage to Transformers directly.

Open source status

Provide useful links for the implementation

https://github.com/OpenGVLab/InternImage

souravpy commented 1 year ago

Can I take it up?

Weiyun1025 commented 1 year ago

> Can I take it up?

Of course, thank you!

adit299 commented 1 year ago

@souravpy Are you currently working on this? If not, I would love to take a look to see if I could help in adding this model to HF Transformers!

amyeroberts commented 1 year ago

The modeling code and weights for InternImage are already on the Hub, so the model can already be used directly with the AutoModel API.

cf. https://github.com/huggingface/transformers/pull/23782#issuecomment-1568459737
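For reference, loading Hub-hosted modeling code goes through `trust_remote_code`; a minimal sketch is below, where the checkpoint id is an assumption (check the OpenGVLab organization on the Hub for the exact InternImage repo names).

```python
from transformers import AutoModel

# Downloads and executes the modeling code hosted alongside the checkpoint,
# so the repository's code must be trusted explicitly.
model = AutoModel.from_pretrained(
    "OpenGVLab/internimage_t_1k_224",  # assumed checkpoint name, for illustration
    trust_remote_code=True,
)
model.eval()
```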