Closed: ZahraGhasemi-AI closed this issue 2 years ago
Hi, could you provide more detail? Could you share the stacktrace of the error for example?
I got this error at first:

```
...
File "/media/2TB_2/ZGH/DontDelet/venv3.6.9/lib/python3.6/site-packages/transformers/modeling_utils.py", line 625, in from_pretrained
    pretrained_model_name_or_path,
OSError: Error no file named ['pytorch_model.bin', 'tf_model.h5', 'model.ckpt.index'] found in directory electra-small or `from_tf` set to False
```
Then I set `from_tf` to `True` and got another error:
```
...
File "/media/2TB_2/ZGH/DontDelet/venv3.6.9/lib/python3.6/site-packages/transformers/modeling_electra.py", line 100, in load_tf_weights_in_electra
    assert pointer.shape == array.shape, original_name
AssertionError: ('electra/encoder/layer_0/attention/self/key/bias', torch.Size([252]), (256,))
```
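The assertion above is a plain shape comparison between the PyTorch parameter (sized from the model config) and the tensor read out of the TF checkpoint. A minimal sketch of that check, using the numbers from the error message (the variable names mirror the stack trace but the snippet itself is illustrative):

```python
# Shapes taken from the AssertionError above: the PyTorch model built from
# the local config expects a 252-long key bias, but the TF checkpoint
# stores a 256-long one for electra/encoder/layer_0/attention/self/key/bias.
pointer_shape = (252,)  # from the model instantiated out of config.json
array_shape = (256,)    # from the TF checkpoint

# load_tf_weights_in_electra refuses to copy weights when these differ,
# which usually means config.json does not match the checkpoint it sits next to.
shapes_match = pointer_shape == array_shape
print("shapes match:", shapes_match)  # prints: shapes match: False
```

When this fires, the fix is typically to use the config.json that was saved alongside the original checkpoint rather than one for a differently sized model.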
And when I used bert-base-multilingual-cased (TF version) as the transformer, I got this:

```
...
model = load_tf2_checkpoint_in_pytorch_model(model, resolved_archive_file, allow_missing_keys=True)
File "/media/2TB_2/ZGH/DontDelet/venv3.6.9/lib/python3.6/site-packages/transformers/modeling_tf_pytorch_utils.py", line 252, in load_tf2_checkpoint_in_pytorch_model
    tf_model_class = getattr(transformers, tf_model_class_name)
AttributeError: module 'transformers' has no attribute 'TFBertForMaskedLM'
```
But when I used bert-base-multilingual-cased (PyTorch version), it ran correctly, so the problem is with loading TF1 checkpoints...
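That `AttributeError` usually means `transformers` could not find a TensorFlow backend at import time, so the `TF*` model classes were never attached to the module. A quick, stdlib-only way to check whether TensorFlow is importable in the environment (no `transformers` import needed):

```python
import importlib.util

# transformers only exposes TFBertForMaskedLM and friends when a TensorFlow
# installation is importable; this probes for one without importing it.
tf_available = importlib.util.find_spec("tensorflow") is not None
print("TensorFlow importable:", tf_available)
```

If this prints `False`, installing TensorFlow 2 into the same virtualenv should make the `TF*` classes appear.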
Hmm, these errors are coming from the `transformers` package, not `declutr`. The stack trace shows the following files:

- `/media/2TB_2/ZGH/DontDelet/venv3.6.9/lib/python3.6/site-packages/transformers/modeling_utils.py`
- `/media/2TB_2/ZGH/DontDelet/venv3.6.9/lib/python3.6/site-packages/transformers/modeling_electra.py`
- `/media/2TB_2/ZGH/DontDelet/venv3.6.9/lib/python3.6/site-packages/transformers/modeling_tf_pytorch_utils.py`
What is the value of `pretrained_model_name_or_path`? Can you try loading it outside of `declutr`, e.g.

```python
from transformers import AutoModel, AutoTokenizer

model = AutoModel.from_pretrained(pretrained_model_name_or_path)
tokenizer = AutoTokenizer.from_pretrained(pretrained_model_name_or_path)
```
I tried it and got the same error:

```
Traceback (most recent call last):
File..............., line 3, in
```
Right, so this suggests the problem is with `transformers`, not `declutr`. You may have to ask for help on the `transformers` repo!
Hi. I want to load a pretrained model that was trained with TensorFlow 1.15. Is there any solution for this? I found some solutions, but all of them were for converting TensorFlow 2 checkpoints to pytorch_model.bin; I used TensorFlow 1.15.
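One route worth trying, assuming the directory holds an original TF 1.x ELECTRA checkpoint plus a matching config.json: `transformers` ships a conversion script for original ELECTRA checkpoints, `convert_electra_original_tf_checkpoint_to_pytorch.py`. The paths below are placeholders, and the exact flag names should be checked against the copy bundled with your installed `transformers` version:

```shell
# Convert an original TF1 ELECTRA checkpoint into pytorch_model.bin so it
# can then be loaded with from_pretrained and no from_tf flag.
python convert_electra_original_tf_checkpoint_to_pytorch.py \
    --tf_checkpoint_path electra-small/model.ckpt \
    --config_file electra-small/config.json \
    --pytorch_dump_path electra-small/pytorch_model.bin \
    --discriminator_or_generator discriminator
```

After the script writes `pytorch_model.bin` next to `config.json`, the earlier `OSError` about missing checkpoint files should no longer occur.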