error info:
ValueError: The checkpoint you are trying to load has model type bunny-qwen but Transformers does not recognize this architecture. This could be because of an issue with the checkpoint, or because your version of Transformers is out of date.
in the config.json:
"model_type": "bunny-qwen"
After merging the LoRA adapter with the base model, inference fails with the error above. How can I fix this?
Thanks.
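For reference, here is a minimal sketch for checking what `model_type` the merged checkpoint actually declares, plus one commonly used workaround in a comment. The checkpoint path is a placeholder, and `trust_remote_code=True` is only an assumption about a possible fix (it only helps if the merged folder still ships the custom `bunny-qwen` modeling files), not a confirmed solution:

```python
import json
from pathlib import Path

def read_model_type(checkpoint_dir: str) -> str:
    """Return the model_type declared in the checkpoint's config.json."""
    config = json.loads(Path(checkpoint_dir, "config.json").read_text())
    return config.get("model_type", "")

# The error means stock Transformers has no model class registered for
# "bunny-qwen". If the merged checkpoint still contains its custom
# modeling files, one possible workaround is:
#
#   from transformers import AutoModelForCausalLM
#   model = AutoModelForCausalLM.from_pretrained(
#       "/path/to/merged-model",  # hypothetical path
#       trust_remote_code=True,   # allow loading the custom architecture code
#   )
```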