EleutherAI / math-lm


exporting to Hugging Face? #87

Closed: aojunzz closed this 11 months ago

aojunzz commented 11 months ago

I tried to convert the NeoX checkpoint to HF format using convert_sequential_to_hf.py; the config is https://github.com/EleutherAI/gpt-neox/blob/6bc724b90895fb7de7d324a385e5ca6992d54e9e/configs/llemma_7b.yml.

But I encountered the error: KeyError: 'sequential.2.mlp.dense_4h_to_h.weight'
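
For context, a minimal sketch of the mismatch that can produce this error, assuming the checkpoint uses a LLaMA-style gated MLP whose weights are stored under different names than the stock GPT-NeoX MLP. The w1/w2/w3 key names on the checkpoint side are illustrative assumptions, not taken from the actual checkpoint:

```python
# Illustrative only: the stock converter indexes the checkpoint's state dict
# by the GPT-NeoX MLP parameter names, so a checkpoint whose MLP weights are
# stored under LLaMA-style names (assumed below) raises a KeyError.

# Per-layer MLP names the unmodified conversion script looks up.
expected_mlp_keys = [
    "mlp.dense_h_to_4h.weight",
    "mlp.dense_4h_to_h.weight",
]

# What a LLaMA-style (gated SwiGLU) layer might contain instead: three
# projections, under assumed w1/w2/w3 names, with dummy placeholder values.
checkpoint_layer = {
    "sequential.2.mlp.w1.weight": [0.0],
    "sequential.2.mlp.w2.weight": [0.0],
    "sequential.2.mlp.w3.weight": [0.0],
}

prefix = "sequential.2."
for key in expected_mlp_keys:
    try:
        weight = checkpoint_layer[prefix + key]
    except KeyError as err:
        # e.g. KeyError: 'sequential.2.mlp.dense_4h_to_h.weight'
        print("KeyError:", err)
```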

haileyschoelkopf commented 11 months ago

Hi, thanks for raising an issue!

It looks like I did not check our updated-for-llama conversion scripts into our submodule branch; I will go do this.
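
For reference, a rough sketch of the kind of key remapping such an updated converter needs to do for the gated MLP. This is not the actual script: the w1/w2/w3 names and the sequential layer prefix on the NeoX side are assumptions for illustration, while gate_proj, up_proj, and down_proj are the MLP parameter names used by Hugging Face's LlamaForCausalLM.

```python
def remap_llama_mlp(neox_state_dict, neox_layer_idx, hf_layer_idx):
    """Return HF-named MLP weights for one transformer layer (sketch only)."""
    src = f"sequential.{neox_layer_idx}.mlp"   # assumed NeoX-side layout
    dst = f"model.layers.{hf_layer_idx}.mlp"   # HF LlamaForCausalLM layout
    mapping = {
        f"{src}.w1.weight": f"{dst}.gate_proj.weight",  # assumed: w1 = gate
        f"{src}.w3.weight": f"{dst}.up_proj.weight",    # assumed: w3 = up
        f"{src}.w2.weight": f"{dst}.down_proj.weight",  # assumed: w2 = down
    }
    return {hf_key: neox_state_dict[neox_key] for neox_key, hf_key in mapping.items()}


# Tiny demo with placeholder values instead of real tensors.
dummy = {f"sequential.2.mlp.w{i}.weight": f"tensor_{i}" for i in (1, 2, 3)}
print(remap_llama_mlp(dummy, neox_layer_idx=2, hf_layer_idx=0))
```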