Closed · jui0616 closed this issue 2 months ago
```python
import torch
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    './check/BAAI/Bunny-v1___0-3B',  # local path to Bunny-v1.0-3B
    torch_dtype=torch.float16,       # float32 for cpu
    device_map='auto',
    trust_remote_code=True)
```
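For completeness, the matching tokenizer can be loaded from the same local path. This is a minimal sketch assuming the snapshot layout above; note that a tokenizer is plain Python and is not moved to a device with `.to(...)`:

```python
from transformers import AutoTokenizer

# Hedged sketch: same local snapshot path as the model above.
tokenizer = AutoTokenizer.from_pretrained(
    './check/BAAI/Bunny-v1___0-3B',
    trust_remote_code=True)
```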
+1
Closing the issue for now since there has been no further discussion. Feel free to reopen it if there are any other questions.
When I load the pretrained model locally, a `KeyError: 'bunny-phi'` occurs.
```python
model = AutoModelForCausalLM.from_pretrained(
    model_name='BAAI/Bunny-v1_0-3B',
    torch_dtype=torch.float16,  # float32 for cpu
    device_map='auto',
    trust_remote_code=True,
    pretrained_model_name_or_path="./check/BAAI/Bunny-v1___0-3B").to(device)
tokenizer = AutoTokenizer.from_pretrained(
    pretrained_model_name_or_path="./check/BAAI/Bunny-v1___0-3B").to(device)
```
pretrained_model_name_or_path="./check/BAAI/Bunny-v1___0-3B").to(device) pretrained_model_name_or_path, return_unused_kwargs=True, **kwargs config_class = CONFIG_MAPPING[config_dict["model_type"]]
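Background on the error (not stated in the thread, but standard transformers behavior): Bunny's `config.json` declares `model_type: "bunny-phi"`, which is not a built-in type in transformers' `CONFIG_MAPPING`, so `AutoConfig` raises this `KeyError` unless it can resolve the class through the repo's remote code, i.e. `trust_remote_code=True` together with the `auto_map` entries in the config. A quick way to inspect a local snapshot (using the path from this thread):

```python
import json

# Inspect the local snapshot's config.json (path taken from this thread).
with open('./check/BAAI/Bunny-v1___0-3B/config.json') as f:
    cfg = json.load(f)

print(cfg['model_type'])    # expected: 'bunny-phi' (not a built-in transformers type)
print(cfg.get('auto_map'))  # must be present for trust_remote_code=True to resolve it
```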