invictus717 / MiCo

Explore the Limits of Omni-modal Pretraining at Scale
https://invictus717.github.io/MiCo/
Apache License 2.0

Will you make the pre-training code and training dataset public? #2

Open handsomelys opened 3 months ago

handsomelys commented 3 months ago

I admire your work and am interested in following up on it. Will you make the pre-training code and training dataset public?

invictus717 commented 3 months ago

Yes! We promise that all the data and code will be made publicly available. We are working hard on cleaning up the code and preparing documentation for pretraining, fine-tuning, and instruction tuning with LLMs; the only challenge is that it is a huge engineering task. We plan to release a series of pretrained multimodal LLMs in the next few days.

handsomelys commented 3 months ago

> Yes! We promise that all the data and code will be made publicly available. We are working hard on cleaning up the code and preparing documentation for pretraining, fine-tuning, and instruction tuning with LLMs; the only challenge is that it is a huge engineering task. We plan to release a series of pretrained multimodal LLMs in the next few days.

This is so exciting! I’m really looking forward to your follow-up work!