princeton-nlp / LLM-Shearing

[ICLR 2024] Sheared LLaMA: Accelerating Language Model Pre-training via Structured Pruning
https://arxiv.org/abs/2310.06694
MIT License

KeyError: 'state' #49

Open changheecho opened 7 months ago

changheecho commented 7 months ago

Hello, thank you for your work on pruning.

When I convert a pruned model to a Hugging Face model, I get a KeyError:

MODEL_PATH='models/Llama-2-7b-composer/pruned-state_dict.pt'
OUTPUT_PATH='models/Llama-2-7b-composer/hf-state_dict'
MODEL_CLASS='LlamaForCausalLM'
HIDDEN_SIZE=2048
NUM_ATTENTION_HEADS=16
NUM_HIDDEN_LAYERS=24
INTERMEDIATE_SIZE=5504
MODEL_NAME='Sheared-Llama-1.3B'

!python3 -m llmshearing.utils.composer_to_hf $MODEL_PATH $OUTPUT_PATH \
model_class=$MODEL_CLASS \
hidden_size=$HIDDEN_SIZE \
num_attention_heads=$NUM_ATTENTION_HEADS \
num_hidden_layers=$NUM_HIDDEN_LAYERS \
intermediate_size=$INTERMEDIATE_SIZE \
num_key_value_heads=$NUM_ATTENTION_HEADS \
_name_or_path=$MODEL_NAME

│ /tf/LLM-Shearing/llmshearing/utils/composer_to_hf.py:108 in <module>         │
│                                                                              │
│   105 if __name__ == "__main__":                                             │
│   106 │   composer_model_path, output_path, other_args = sys.argv[1], sys.ar │
│   107 │   cli_cfg = om.from_cli(other_args)                                  │
│ ❱ 108 │   save_composer_to_hf(composer_model_path, output_path, cli_cfg)     │
│   109 │   #save_hf_to_composer(composer_model_path, output_path)             │
│   110                                                                        │
│                                                                              │
│ /tf/LLM-Shearing/llmshearing/utils/composer_to_hf.py:90 in                   │
│ save_composer_to_hf                                                          │
│                                                                              │
│    87 def save_composer_to_hf(composer_model_path, output_path=None, model_c │
│    88 │   """ convert composer ckpt's weights to huggingface """             │
│    89 │                                                                      │
│ ❱  90 │   weights = torch.load(composer_model_path)["state"]["model"]        │
│    91 │   num_layers = get_layer_num_from_weights(weights)                   │
│    92 │   keymap = get_key_map_from_composer_to_hf(num_layers)               │
│    93 │   hf_weights = {keymap[key]: weights[key] for key in weights if "rot │
╰──────────────────────────────────────────────────────────────────────────────╯
KeyError: 'state'

What can I do to fix this error?

Thank you.

xiamengzhou commented 7 months ago

It seems that the checkpoint you saved does not have a "state" key inside, which likely means the save failed. Could you check what the checkpoint actually contains?
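A minimal sketch of that check, with a hypothetical `extract_model_weights` helper (not part of the repo). Full Composer checkpoints nest weights under `ckpt["state"]["model"]`, while a bare state dict (like `pruned-state_dict.pt`) may already be the weights; plain dicts stand in below for the result of `torch.load(...)`:

```python
def extract_model_weights(ckpt):
    """Return model weights from a loaded checkpoint dict.

    Handles both a full Composer checkpoint (weights nested under
    ckpt["state"]["model"]) and a bare state dict saved directly.
    """
    if "state" in ckpt:
        return ckpt["state"]["model"]
    # Fall back: either a {"model": ...} wrapper or already a state dict.
    return ckpt.get("model", ckpt)

# Plain dicts standing in for torch.load(composer_model_path):
full_ckpt = {"state": {"model": {"model.embed_tokens.weight": 1}}}
bare_ckpt = {"model.embed_tokens.weight": 1}

print(sorted(extract_model_weights(full_ckpt)))
print(sorted(extract_model_weights(bare_ckpt)))
```

Printing `ckpt.keys()` right after `torch.load` is the quickest way to see which of these shapes you actually have.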

Actually, the code was buggy! I've updated it :)

changheecho commented 6 months ago

Thank you so much. I will shear my model again :)