BAAI-DCAI / Bunny

A family of lightweight multimodal models.
Apache License 2.0

quantisation modes #27

Closed nevilshah235 closed 5 months ago

nevilshah235 commented 5 months ago

Can the bunny models be loaded in 4bit or 8bit quantised modes?

nevilshah235 commented 5 months ago

Code

```python
import os

from bunny.model.builder import load_pretrained_model
from bunny.util.mm_utils import get_model_name_from_path

model_path = 'bunny-phi-1.5-siglip-lora'
model_base = 'phi-1_5'
model_type = 'phi-1.5'

model_path = os.path.expanduser(model_path)
model_name = get_model_name_from_path(model_path)

tokenizer, model, image_processor, context_len = load_pretrained_model(
    model_path, model_base, model_name, model_type
)
```

When I try to generate the merged LoRA weights for the model in 4-bit or 8-bit, I get the following errors.

4bit model

[Screenshot: error traceback, 2024-03-14 7:15 PM]

8bit model

[Screenshot: error traceback, 2024-03-14 7:17 PM]