I saved the trained Unsloth model as a LoRA adapter:
model.save_pretrained_merged(adapter_path, tokenizer, save_method="lora")
Then I tried to load it with vllm.LLM like this:
llm = LLM(adapter_path, tensor_parallel_size=tensor_parallel_size, enable_lora=True, max_lora_rank=64)
It returned this error:
ValueError: No supported config format found in unsloth_llama_3_1_pathology_epoch8_adapter
But if I save the fully merged model instead:
model.save_pretrained_merged(merged_model_path, tokenizer)
vLLM can load it.
@xinyudong93 Apologies for the delay - loading LoRA adapters with vLLM is documented at https://docs.vllm.ai/en/latest/models/lora.html
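The "No supported config format found" error most likely comes up because LLM() expects a full model directory containing a config.json, while a LoRA-only save produces just the adapter files (adapter_config.json plus the adapter weights). With vLLM you instead load the base model and attach the adapter per request via LoRARequest. Below is a minimal sketch following that docs page; the base model name, prompt, and adapter name are placeholders I'm assuming here, so substitute whatever your adapter was actually trained from.

from vllm import LLM, SamplingParams
from vllm.lora.request import LoRARequest

# Assumption: replace with the base model your adapter was trained from.
base_model = "meta-llama/Llama-3.1-8B-Instruct"
adapter_path = "unsloth_llama_3_1_pathology_epoch8_adapter"

# Load the *base* model; enable_lora lets vLLM accept adapters at request time.
llm = LLM(
    model=base_model,
    enable_lora=True,
    max_lora_rank=64,
    tensor_parallel_size=1,
)

# Attach the adapter per request: LoRARequest(name, id, local_path).
outputs = llm.generate(
    ["Describe the key histology findings."],  # placeholder prompt
    SamplingParams(max_tokens=128),
    lora_request=LoRARequest("pathology_adapter", 1, adapter_path),
)
print(outputs[0].outputs[0].text)

This way you don't have to merge the weights at all; the same base model can serve several adapters by passing a different LoRARequest per call.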