bytedance / coconut_cvpr2024

Apache License 2.0

[CVPR2024] 🥥COCONut: Crafting the Future of Segmentation Datasets with Exquisite Annotations in the Era of ✨Big Data✨

Xueqing Deng, Qihang Yu, Peng Wang, Xiaohui Shen, Liang-Chieh Chen

Dataset Website | Paper | Full Paper

🚀 Contributions

🔥 The first large-scale human-verified dataset for segmentation; more info can be found on our website.

🔥 COCONut is now available on Kaggle and Hugging Face; you are welcome to download it!

teaser

📢 News!

TODO

Dataset Splits

| Splits | #images | #masks | images | Kaggle | Hugging Face |
|---|---|---|---|---|---|
| COCONut-S | 118K | 1.54M | download | download | preview |
| COCONut-B | 242K | 2.78M | download | download | preview |
| COCONut-L | 358K | 4.75M | coming | coming | coming |
| relabeled-COCO-val | 5K | 67K | download | download | preview |
| COCONut-val | 25K | 437K | coming | download | coming |

Get Started

We only provide the annotations; anyone who wants to use them will also need to download the images from these links: COCONut-S images, COCONut-B images, and relabeled COCO-val images.

We provide two ways to download the dataset annotations; details are below.

🔗Kaggle download link

You can use the web UI to download the dataset directly on Kaggle.

If you find our dataset useful, we would really appreciate it if you could upvote it on Kaggle.

🔗Huggingface dataset preview

Directly downloading the data from Hugging Face or git cloning the Hugging Face dataset repo will result in an invalid data structure.

We recommend using our provided download script to fetch the dataset from Hugging Face:

```shell
pip install datasets tqdm
python download_coconut.py  # default split: relabeled_coco_val
```

You can switch to downloading COCONut-S by adding `--split coconut_s` to the command:

```shell
python download_coconut.py --split coconut_s
```

The data will be saved to `./coconut_datasets` by default; you can change this to your preferred path with `--output_dir YOUR_DATA_PATH`.
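Once downloaded, the panoptic annotations can be decoded like COCO's: each segment is encoded as an RGB color in a PNG, with the segment id recoverable via the `rgb2id` convention from panopticapi. A minimal sketch, assuming COCONut keeps COCO's panoptic PNG encoding (the toy array below stands in for a real annotation file):

```python
import numpy as np

def rgb2id(color: np.ndarray) -> np.ndarray:
    """Decode COCO-style panoptic RGB encoding: id = R + 256*G + 256**2*B."""
    color = color.astype(np.uint32)
    return color[..., 0] + 256 * color[..., 1] + 256 ** 2 * color[..., 2]

# Toy 2x2 "annotation" with two segments; in practice, load the PNG
# (e.g. with PIL) into an HxWx3 uint8 array first.
ann = np.array([[[1, 0, 0], [1, 0, 0]],
                [[0, 1, 0], [0, 1, 0]]], dtype=np.uint8)
seg_ids = rgb2id(ann)
print(np.unique(seg_ids))  # two segment ids: 1 and 256
```

Each resulting id can then be matched against the `segments_info` entries in the accompanying JSON to get category and instance information.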

Tutorials

FAQ

We summarize common issues in FAQ.md; please check it before opening a new issue.

More visualization on COCONut annotation

vis1

vis2

Terms of use

For images, we follow the same license as the COCO dataset. COCONut's annotations are for non-commercial use only.

Acknowledgement

Bibtex

If you find our dataset useful, please cite:

```bibtex
@inproceedings{coconut2024cvpr,
  author    = {Xueqing Deng and Qihang Yu and Peng Wang and Xiaohui Shen and Liang-Chieh Chen},
  title     = {COCONut: Modernizing COCO Segmentation},
  booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition},
  year      = {2024},
}
```