BradyFU / Awesome-Multimodal-Large-Language-Models

:sparkles::sparkles:Latest Advances on Multimodal Large Language Models

A new paper in NeurIPS 2023: Bootstrapping Vision-Language Learning with Decoupled Language Pre-training #91

Closed: yiren-jian closed this issue 9 months ago

yiren-jian commented 9 months ago

Hello,

Thanks for the great project. We have a new paper accepted at NeurIPS 2023: https://arxiv.org/abs/2307.07063, https://openreview.net/forum?id=8Kch0ILfQH, https://github.com/yiren-jian/BLIText

@inproceedings{jian2023bootstrapping,
  title={Bootstrapping Vision-Language Learning with Decoupled Language Pre-training},
  author={Jian, Yiren and Gao, Chongyang and Vosoughi, Soroush},
  booktitle={Thirty-seventh Conference on Neural Information Processing Systems},
  year={2023},
  url={https://openreview.net/forum?id=8Kch0ILfQH}
}

Would you consider adding it to the project?

Thank you.

xjtupanda commented 9 months ago

Thanks for sharing! We have added the paper to our repo. If you find our projects helpful to your research, please consider citing:

@article{yin2023survey,
  title={A Survey on Multimodal Large Language Models},
  author={Yin, Shukang and Fu, Chaoyou and Zhao, Sirui and Li, Ke and Sun, Xing and Xu, Tong and Chen, Enhong},
  journal={arXiv preprint arXiv:2306.13549},
  year={2023}
}

@article{fu2023mme,
  title={MME: A Comprehensive Evaluation Benchmark for Multimodal Large Language Models},
  author={Fu, Chaoyou and Chen, Peixian and Shen, Yunhang and Qin, Yulei and Zhang, Mengdan and Lin, Xu and Qiu, Zhenyu and Lin, Wei and Yang, Jinrui and Zheng, Xiawu and Li, Ke and Sun, Xing and Ji, Rongrong},
  journal={arXiv preprint arXiv:2306.13394},
  year={2023}
}

@article{yin2023woodpecker,
  title={Woodpecker: Hallucination Correction for Multimodal Large Language Models},
  author={Yin, Shukang and Fu, Chaoyou and Zhao, Sirui and Xu, Tong and Wang, Hao and Sui, Dianbo and Shen, Yunhang and Li, Ke and Sun, Xing and Chen, Enhong},
  journal={arXiv preprint arXiv:2310.16045},
  year={2023}
}