InternLM / xtuner

An efficient, flexible and full-featured toolkit for fine-tuning LLMs (InternLM2, Llama3, Phi3, Qwen, Mistral, ...)
https://xtuner.readthedocs.io/zh-cn/latest/
Apache License 2.0

shape mismatch when loading llava-phi pth #681

Open shockjiang opened 5 months ago

shockjiang commented 5 months ago

I am trying to load the pretrained LLaVA pth checkpoint hub/llava-phi-3-mini-pth/model.pth, and I get this strange error:

RuntimeError: Error(s) in loading state_dict for LLaVAModel:
    size mismatch for llm.model.embed_tokens.weight: copying a param with shape torch.Size([32064, 3072]) from checkpoint, the shape in current model is torch.Size([0]).
    size mismatch for llm.model.layers.0.self_attn.o_proj.weight: copying a param with shape torch.Size([3072, 3072]) from checkpoint, the shape in current model is torch.Size([0]).

any clue? thx!

pppppM commented 5 months ago

The issue might be due to the local model not being initialized correctly.

Before loading the checkpoint, check whether the model actually contains the key llm.model.layers.0.self_attn.o_proj.weight (a quick sketch of that check follows).
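
A minimal sketch of that check (illustrative, not from the thread; model stands in for the LLaVAModel instance built from the xtuner config):

    key = "llm.model.layers.0.self_attn.o_proj.weight"
    state = model.state_dict()

    if key not in state:
        print(f"{key} is missing from the local model")
    else:
        # A shape of torch.Size([0]), as in the traceback, points to an
        # uninitialized (or DeepSpeed ZeRO-3 partitioned) parameter.
        print(key, tuple(state[key].shape))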

hhaAndroid commented 5 months ago

@shockjiang Can you try setting this in https://github.com/InternLM/xtuner/blob/main/xtuner/configs/deepspeed/deepspeed_zero3.json?

{"zero_optimization": {
  "stage3_prefetch_bucket_size":0}
}
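
For context, zero-sized tensors like torch.Size([0]) are what DeepSpeed ZeRO-3 parameter partitioning looks like from a single rank, which is likely why the mismatch appears at load time. As an alternative workaround, here is a hedged sketch (not suggested in the thread) that gathers the partitioned parameters before loading the full checkpoint; model again stands in for the ZeRO-3 wrapped LLaVAModel, and the process group is assumed to be initialized:

    import deepspeed
    import torch

    checkpoint = torch.load("hub/llava-phi-3-mini-pth/model.pth", map_location="cpu")
    # Depending on how the pth was saved, the weights may live under
    # checkpoint["state_dict"] instead of at the top level.

    # Inside this context every rank temporarily materializes the full
    # parameters, so load_state_dict no longer sees torch.Size([0]);
    # modifier_rank=0 re-partitions and broadcasts rank 0's loaded
    # weights on exit.
    with deepspeed.zero.GatheredParameters(list(model.parameters()), modifier_rank=0):
        if torch.distributed.get_rank() == 0:
            model.load_state_dict(checkpoint)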