ModelTC / llmc

[EMNLP 2024 Industry Track] This is the official PyTorch implementation of "LLMC: Benchmarking Large Language Model Quantization with a Versatile Compression Toolkit".
https://arxiv.org/abs/2405.06001
Apache License 2.0

Support for VLMs: LLaVA, InternVL2, LLaMA 3.2, Qwen2VL. #183

Closed SmudgedWings closed 1 week ago

SmudgedWings commented 1 week ago

Adds support for VLMs: LLaVA, InternVL2, LLaMA 3.2, and Qwen2VL. The image-text calibration dataset loading is now more accurate and convenient. LLaVA, InternVL2, and Qwen2VL also support mixed-dataset calibration (calibration sets that combine image-text pairs with text-only samples); a rough sketch of what such a mixed set can look like is given below.
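
For illustration only, here is a minimal Python sketch of how a mixed image-text / text-only calibration set might be assembled. This is not llmc's actual API; the function name, sample format, and arguments are hypothetical, and only standard PIL calls are used.

```python
# Hypothetical helper (not part of llmc): build a mixed calibration list where
# each entry is either an image-text pair or a text-only prompt.
from PIL import Image


def build_mixed_calib_set(image_text_samples, text_only_samples, n_samples=128):
    """Combine image-text pairs and text-only prompts into one calibration list.

    image_text_samples: iterable of (image_path, prompt) tuples
    text_only_samples:  iterable of prompt strings
    """
    calib = []
    for path, prompt in image_text_samples:
        # Image-text pair: load the image and keep the paired prompt.
        calib.append({"image": Image.open(path).convert("RGB"), "text": prompt})
    for prompt in text_only_samples:
        # Text-only entry: no image, just the prompt.
        calib.append({"image": None, "text": prompt})
    # Truncate to the desired number of calibration samples.
    return calib[:n_samples]
```

A downstream quantization loop would then feed each entry through the model's processor, skipping the image branch for text-only entries; the exact handling in llmc may differ.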