The failing tensor is lm_head.weight. The following patch fixes this issue:
```diff
diff --git a/examples/gptj/convert_checkpoint.py b/examples/gptj/convert_checkpoint.py
index 8c062bc4..f00f8f33 100644
--- a/examples/gptj/convert_checkpoint.py
+++ b/examples/gptj/convert_checkpoint.py
@@ -249,7 +249,7 @@ def convert_hf_gptj(hf_model: GPTJForCausalLM,
     weights['lm_head.weight'] = split_matrix(lm_head_w,
                                              mapping.tp_size,
                                              mapping.tp_rank,
-                                             dim=0)
+                                             dim=0).contiguous()
     weights['lm_head.bias'] = split_matrix(ln_head_bias,
                                            mapping.tp_size,
                                            mapping.tp_rank,
```
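For context on why `.contiguous()` is needed: the checkpoint writer rejects tensors that are views rather than densely packed buffers, and a split taken from a transposed or otherwise strided weight produces exactly such a view. Below is a minimal sketch of the failure mode, assuming the shards are serialized with safetensors; the tensor names and shapes are illustrative, not taken from convert_checkpoint.py.

```python
import torch
from safetensors.torch import save_file

# Stand-in for an lm_head weight that is a view (e.g. a transposed
# embedding matrix); the slice mimics a tensor-parallel split on dim=0.
w = torch.randn(8, 16).t()      # non-contiguous view, shape (16, 8)
shard = w[0:8]                  # "rank 0" slice, still non-contiguous
print(shard.is_contiguous())    # False

try:
    save_file({"lm_head.weight": shard}, "rank0.safetensors")
except ValueError as e:
    print(e)  # complains about saving a non-contiguous tensor

# .contiguous() copies the shard into a dense buffer, which saves fine.
save_file({"lm_head.weight": shard.contiguous()}, "rank0.safetensors")
```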
### System Info

### Who can help?

No response
### Information

### Tasks

- An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)

### Reproduction

Convert a GPT-J Hugging Face checkpoint with examples/gptj/convert_checkpoint.py.
### Expected behavior

A successful checkpoint conversion and engine build.
### actual behavior

Checkpoint conversion fails with the error "You are trying to save a noncontiguous tensor...."
### additional notes

Conversion of Llama weights succeeds without error.
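A broader, hypothetical variant of the fix would normalize every entry of the weights dict before it reaches the serializer, instead of patching tensors one at a time; this is only a sketch, not part of the patch above:

```python
# Hypothetical guard: force every shard into a dense buffer before saving.
# For tensors that are already contiguous, .contiguous() returns the tensor
# itself, so this adds no extra copies in the common case.
weights = {name: t.contiguous() for name, t in weights.items()}
```

The one-line patch above is more surgical, since lm_head.weight is the only tensor observed to trip the check.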