Hi Team! I am new to Hugging Face, and I ran into an error when loading a model from local disk. Specifically, I installed git-lfs and ran
git clone https://huggingface.co/alchemab/antiberta2-cssp
and then called
model = ReformerModel.from_pretrained("/mnt/d/unifiedBCR/antiberta2-cssp")
since I only want to obtain the embeddings. However, the following error was raised:
You are using a model of type roformer to instantiate a model of type reformer. This is not supported for all configurations of models and can yield errors.
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/home/yipingzou2/.local/lib/python3.8/site-packages/transformers/modeling_utils.py", line 3832, in from_pretrained
    model = cls(config, *model_args, **model_kwargs)
  File "/home/yipingzou2/.local/lib/python3.8/site-packages/transformers/models/reformer/modeling_reformer.py", line 1982, in __init__
    self.embeddings = ReformerEmbeddings(config)
  File "/home/yipingzou2/.local/lib/python3.8/site-packages/transformers/models/reformer/modeling_reformer.py", line 229, in __init__
    AxialPositionEmbeddings(config) if config.axial_pos_embds else PositionEmbeddings(config)
  File "/home/yipingzou2/.local/lib/python3.8/site-packages/transformers/models/reformer/modeling_reformer.py", line 124, in __init__
    raise ValueError(
ValueError: Make sure that config.axial_pos_embds factors: (64, 192) sum to config.hidden_size: 1024
How can I fix this error so that I can extract embeddings from the pre-trained model loaded from local disk? Thanks!
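For what it's worth, the first warning line suggests the checkpoint's config.json declares model_type roformer, so I suspect I should be loading it with the RoFormer classes (or AutoModel, which dispatches on the config) rather than ReformerModel. Is something like the sketch below the right way to get embeddings? The space-separated amino-acid input and the use of last_hidden_state are just my guesses at the usual pattern, not something I have confirmed for this model:

import torch
from transformers import AutoTokenizer, AutoModel

# Load from the local clone; AutoModel should pick the class from the
# model_type in config.json (roformer) instead of forcing Reformer.
local_dir = "/mnt/d/unifiedBCR/antiberta2-cssp"
tokenizer = AutoTokenizer.from_pretrained(local_dir)
model = AutoModel.from_pretrained(local_dir)
model.eval()

# Example antibody sequence (spaces between residues are my assumption
# about how this tokenizer expects its input).
sequence = "Q V Q L V Q S G A E V K K P G A S V K V S C K A S"
inputs = tokenizer(sequence, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Per-token embeddings from the final layer.
embeddings = outputs.last_hidden_state
print(embeddings.shape)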