BAAI-DCAI / Bunny

A family of lightweight multimodal models.
Apache License 2.0

KeyError: 'bunny-phi' #105

Closed jui0616 closed 2 months ago

jui0616 commented 3 months ago

When I load the pretrained model from a local path, `KeyError: 'bunny-phi'` occurs:

```python
model = AutoModelForCausalLM.from_pretrained(
    model_name='BAAI/Bunny-v1_0-3B',
    torch_dtype=torch.float16,  # float32 for cpu
    device_map='auto',
    trust_remote_code=True,
    pretrained_model_name_or_path="./check/BAAI/Bunny-v1_0-3B").to(device)
tokenizer = AutoTokenizer.from_pretrained(
    pretrained_model_name_orpath="./check/BAAI/Bunny-v10-3B",
).to(device)
```

The relevant lines from the traceback:

```
pretrained_model_name_or_path="./check/BAAI/Bunny-v1___0-3B").to(device)
pretrained_model_name_or_path, return_unused_kwargs=True, **kwargs
config_class = CONFIG_MAPPING[config_dict["model_type"]]
```
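For context, the last traceback line is where the error originates: transformers looks up the `model_type` from the checkpoint's `config.json` in its registry of built-in architectures, and the custom type `bunny-phi` is not in it. A minimal sketch of that failure mode, using a stand-in dictionary rather than the real `transformers.CONFIG_MAPPING` (names here are illustrative, not the library's):

```python
# Stand-in for transformers' registry of built-in model types
# (the real CONFIG_MAPPING is much larger; this is only a sketch).
CONFIG_MAPPING = {"phi": "PhiConfig", "llama": "LlamaConfig"}


def resolve_config_class(model_type):
    # transformers resolves config.json's "model_type" against its
    # built-in registry; a custom type like "bunny-phi" is only
    # resolvable when the repo's own code is loaded, which is what
    # trust_remote_code=True (with a correct model path) enables.
    return CONFIG_MAPPING[model_type]


try:
    resolve_config_class("bunny-phi")
except KeyError as exc:
    print(f"KeyError: {exc}")  # → KeyError: 'bunny-phi'
```

This is why the fix below passes the (correct) local path as the first positional argument together with `trust_remote_code=True`: the custom `bunny-phi` config class then comes from the model repo itself instead of the built-in registry.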

Isaachhh commented 3 months ago

```python
model = AutoModelForCausalLM.from_pretrained(
    './check/BAAI/Bunny-v1___0-3B',  # local path to Bunny-v1.0-3B
    torch_dtype=torch.float16,  # float32 for cpu
    device_map='auto',
    trust_remote_code=True)
```

dingtine commented 3 months ago

+1

Isaachhh commented 2 months ago

Closing the issue for now since there has been no further discussion. Feel free to reopen it if you have any other questions.