Hi,
thanks for the kind words. Can you check which version of HF transformers you have installed? Make sure to use the one specified in the requirements; if you already had transformers installed in your environment, you might have the wrong version.
Best, Constantin
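For reference, a quick way to check which transformers build is active in the current environment (compare the output against the version pinned in the requirements file):

```python
# Print the installed transformers version and its install location,
# to compare against the version pinned in the requirements.
import transformers

print(transformers.__version__)
print(transformers.__file__)
```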
@tsujuifu I think this is related to this modification in the finetuneanon fork of the Transformers repo: https://github.com/finetuneanon/transformers#gpt-j-6b (they resized the vocabulary to 54,000 entries). So `pip3 show transformers` should point to the finetuneanon Transformers fork instead of the upstream repo in your environment :)
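For anyone hitting the same shape mismatch, a quick way to see which vocab size a checkpoint was saved with is to inspect the stored weight shapes directly. A minimal sketch along these lines (the `module` key and the parameter-name filter are assumptions about the DeepSpeed checkpoint layout, so print the keys first if they differ):

```python
# Inspect the embedding / LM-head shapes saved in the checkpoint to see
# which vocab size it was trained with. The "module" key is the usual
# DeepSpeed layout, but this is an assumption -- print state.keys() to verify.
import torch

state = torch.load("mp_rank_00_model_states.pt", map_location="cpu")
weights = state.get("module", state)
for name, tensor in weights.items():
    if hasattr(tensor, "shape") and ("wte" in name or "lm_head" in name):
        print(name, tuple(tensor.shape))
```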
That was exactly the problem! We switched to finetuneanon/transformers, and it works well now.
Appreciate the kind reply 😍
Thanks for this wonderful work 😍
I loaded mp_rank_00_model_states.pt, but it reports that the shape of the LM head is different:
I guess this is because of the resize_token_embeddings here.
I also tried truncating the additional dimensions, but the output of example_inference.py seems weird 😂
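For context, this is roughly what `resize_token_embeddings` does to the shapes (a minimal sketch; the model name is illustrative, and the 54,000 figure is the one from the fork mentioned above):

```python
# Illustrative only: resizing the token embeddings grows the embedding
# matrix (and the tied LM head), which is why checkpoint shapes no longer
# match a model built with the upstream vocab size.
from transformers import GPT2LMHeadModel

model = GPT2LMHeadModel.from_pretrained("gpt2")   # illustrative model
print(model.get_input_embeddings().weight.shape)  # torch.Size([50257, 768])
model.resize_token_embeddings(54000)              # vocab size per the comment above
print(model.get_input_embeddings().weight.shape)  # torch.Size([54000, 768])
```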
Super thanks for the help!