huggingface / peft

🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning.
https://huggingface.co/docs/peft
Apache License 2.0

Unable to merge lora into base model properly? #2209

Closed hgftrdw45ud67is8o89 closed 2 weeks ago

hgftrdw45ud67is8o89 commented 2 weeks ago

System Info

py3.11

Who can help?

@BenjaminBossan @sayakpaul

Information

Tasks

Reproduction

    from transformers import AutoModelForCausalLM
    from peft import PeftModel

    base_model = AutoModelForCausalLM.from_pretrained("Qwen/Qwen2.5-7B")
    model = PeftModel.from_pretrained(base_model, loraname)
    model = model.merge_and_unload()  # untested; probably this should be done directly in the training code, saving as a new model
    model.save_pretrained(NEW_MODELBASENAME + "mergedwbase_bf16", tokenizer)
    tokenizer.save_pretrained(NEW_MODELBASENAME + "mergedwbase_bf16", legacy_format=False)
    print("done")

Expected behavior

I expect a folder named xxxx_mergedwbase_bf16 to be created, but it isn't there.

The script also doesn't seem to load the model at all; I don't see any progress bar when I run `python xx.py`:

2024-11-10 23:03:13.070822: I tensorflow/core/util/port.cc:153] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable `TF_ENABLE_ONEDNN_OPTS=0`.
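(Incidentally, the oneDNN lines above are TensorFlow startup notices, not errors. As the message itself suggests, they can be silenced by setting an environment variable before launching the script; a minimal sketch:)

```shell
# Disable oneDNN custom operations to silence the notice
# (this is the log message's own suggestion)
export TF_ENABLE_ONEDNN_OPTS=0
```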

Does anyone have any ideas, or anything I should check, like version compatibility?

githubnemo commented 2 weeks ago

Hey :)

Can you tell us a bit more about the environment you're running this in? The code looks fine, except that you're passing `tokenizer` as a positional argument to `save_pretrained`, which doesn't make sense.

hgftrdw45ud67is8o89 commented 2 weeks ago

Huge thanks, my friend. It worked after removing the tokenizer argument and tweaking a few things. I honestly don't know how, but I will save the script for future use.