TencentARC / LLaMA-Pro

[ACL 2024] Progressive LLaMA with Block Expansion.
https://tencentarc.github.io/LLaMA-Pro/
Apache License 2.0

Should I freeze norm.weight? #7

Open metterian opened 6 months ago

metterian commented 6 months ago

The Llama base model has a final `norm.weight` parameter.

Did you also freeze `norm.weight` during post-training? For reference, the safetensors weight map includes:

    "model.layers.9.self_attn.o_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.9.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
    "model.layers.9.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
    "model.norm.weight": "model-00004-of-00004.safetensors"
hills-code commented 5 months ago

We freeze all the weights of the initial LLaMA model (including `norm.weight`) and only train the newly added blocks.
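
For anyone landing on this thread, here is a minimal PyTorch sketch of that setup. It assumes the expanded model follows the standard Hugging Face `LlamaForCausalLM` layout; the checkpoint path and `new_block_indices` are illustrative placeholders, not values from this repo.

```python
import torch
from transformers import AutoModelForCausalLM

# Hypothetical indices of the newly inserted blocks after block expansion;
# adjust to match your own expanded checkpoint.
new_block_indices = {8, 17, 26, 35}

model = AutoModelForCausalLM.from_pretrained("path/to/expanded-llama")

# Freeze everything from the original model, including embeddings,
# lm_head, and the final model.norm.weight ...
for param in model.parameters():
    param.requires_grad = False

# ... then unfreeze only the newly added blocks.
for idx, layer in enumerate(model.model.layers):
    if idx in new_block_indices:
        for param in layer.parameters():
            param.requires_grad = True

trainable = [name for name, p in model.named_parameters() if p.requires_grad]
print(f"{len(trainable)} trainable tensors; "
      f"model.norm.weight frozen: {not model.model.norm.weight.requires_grad}")
```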