AGI-Edgerunners / LLM-Adapters

Code for our EMNLP 2023 Paper: "LLM-Adapters: An Adapter Family for Parameter-Efficient Fine-Tuning of Large Language Models"
https://arxiv.org/abs/2304.01933
Apache License 2.0

How can I load two different sets of LoRA weights produced by separate fine-tuning runs, one after the other? #54

Open jinlong7790 opened 5 months ago

jinlong7790 commented 5 months ago
    import torch
    from transformers import LlamaForCausalLM
    from peft import PeftModel

    # base_model, load_8bit, and lora_weights are presumably defined earlier
    # in the script (e.g. as CLI arguments).
    model = LlamaForCausalLM.from_pretrained(
        base_model,
        load_in_8bit=load_8bit,
        torch_dtype=torch.float16,
        device_map="auto",
        trust_remote_code=True,
    )
    # First wrap: apply the LoRA weights from the first fine-tuning run.
    model = PeftModel.from_pretrained(
        model,
        lora_weights,
        torch_dtype=torch.float16,
    )
    # Second wrap: apply a second set of LoRA weights on top.
    model = PeftModel.from_pretrained(model, "/home/zzu_zxw/zjl_data/KnowPAT/save4")
HZQ950419 commented 5 months ago

Hi, you only need to load the LoRA weights once. Which file does this second loading code live in?
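
For reference, here is a minimal sketch of how two separately fine-tuned LoRA checkpoints could coexist on one base model using peft's named-adapter API, instead of wrapping the model in PeftModel twice. The checkpoint paths and adapter names below are placeholders, and it assumes a peft version recent enough to support `adapter_name`, `load_adapter`, and `set_adapter`:

    import torch
    from transformers import LlamaForCausalLM
    from peft import PeftModel

    # Placeholder base checkpoint; substitute the one used for fine-tuning.
    model = LlamaForCausalLM.from_pretrained(
        "yahma/llama-7b-hf",
        torch_dtype=torch.float16,
        device_map="auto",
    )

    # First LoRA checkpoint: wrap the base model once and name the adapter.
    model = PeftModel.from_pretrained(
        model,
        "path/to/lora_weights_a",   # hypothetical path
        adapter_name="adapter_a",
        torch_dtype=torch.float16,
    )

    # Second LoRA checkpoint: load it into the same PeftModel
    # instead of wrapping the model again.
    model.load_adapter("path/to/lora_weights_b", adapter_name="adapter_b")

    # Activate whichever adapter you want before generation.
    model.set_adapter("adapter_a")
    # ... generate with adapter_a ...
    model.set_adapter("adapter_b")
    # ... generate with adapter_b ...

With this pattern each checkpoint is loaded exactly once, which matches the suggestion above of not calling PeftModel.from_pretrained a second time.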