OpenBMB / MiniCPM-V

MiniCPM-V 2.6: A GPT-4V Level MLLM for Single Image, Multi Image and Video on Your Phone

Can OmniLMM-12B be quantized to 8-bit? #86

Closed jiayev closed 6 months ago

jiayev commented 6 months ago

For personal research I'm using a 24GB RTX 4090, and in practice the model very easily runs out of VRAM.
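For reference, what I'm asking about is roughly the following kind of 8-bit loading via the transformers/bitsandbytes integration. This is just a sketch: the model id `openbmb/OmniLMM-12B` and whether the model's remote code path actually supports this quantization route are assumptions, which is exactly what this issue is asking.

```python
import torch
from transformers import AutoModel, AutoTokenizer, BitsAndBytesConfig

# Hypothetical 8-bit load of OmniLMM-12B to fit a 24GB GPU.
# Model id and int8 support are assumptions, not confirmed by the maintainers.
quant_cfg = BitsAndBytesConfig(load_in_8bit=True)

model = AutoModel.from_pretrained(
    "openbmb/OmniLMM-12B",       # assumed Hugging Face model id
    trust_remote_code=True,
    quantization_config=quant_cfg,
    device_map="auto",
    torch_dtype=torch.float16,
)
tokenizer = AutoTokenizer.from_pretrained(
    "openbmb/OmniLMM-12B", trust_remote_code=True
)
```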

iceflame89 commented 6 months ago

Hi, OmniLMM-12B does not have an int8 quantized version yet. We have just released the more capable MiniCPM-Llama3-V 2.5, with 8.5B total parameters and performance on par with GPT-4V, and it is available in an int4 quantized version. You're welcome to try it out!
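A minimal usage sketch of the int4 build, assuming it is published on Hugging Face as `openbmb/MiniCPM-Llama3-V-2_5-int4` and exposes the same `chat()` interface as the full-precision model (see the model card for the authoritative example):

```python
import torch
from PIL import Image
from transformers import AutoModel, AutoTokenizer

model_id = "openbmb/MiniCPM-Llama3-V-2_5-int4"  # assumed id of the int4 checkpoint

model = AutoModel.from_pretrained(model_id, trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model.eval()

image = Image.open("example.jpg").convert("RGB")
msgs = [{"role": "user", "content": "Describe this image."}]

# chat() is the interface exposed by the model's remote code;
# the int4 build is assumed to expose the same method.
answer = model.chat(image=image, msgs=msgs, tokenizer=tokenizer)
print(answer)
```

The int4 weights should bring peak VRAM usage well under 24GB, which is the main motivation for preferring it over the unquantized 12B model on a single 4090.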