agemagician / ProtTrans

ProtTrans provides state-of-the-art pretrained language models for proteins. ProtTrans was trained on thousands of GPUs from Summit and hundreds of Google TPUs using transformer models.
Academic Free License v3.0

got an unexpected keyword argument 'gradient_checkpointing' #60

Closed yuyanislearning closed 2 years ago

yuyanislearning commented 2 years ago

I am using PyTorch 1.9.0a and running the ProtBert-BFD-FineTune-SS3.ipynb notebook. I hit an error during model initialization:

```python
AutoModelForTokenClassification.from_pretrained(model_name, num_labels=len(unique_tags), id2label=id2tag, label2id=tag2id, gradient_checkpointing=False)
```

```
File "finetune_protbert.py", line 195, in model_init
    return AutoModelForTokenClassification.from_pretrained(model_name,
File "/opt/conda/lib/python3.8/site-packages/transformers/models/auto/auto_factory.py", line 419, in from_pretrained
    return model_class.from_pretrained(pretrained_model_name_or_path, *model_args, config=config, **kwargs)
File "/opt/conda/lib/python3.8/site-packages/transformers/modeling_utils.py", line 1402, in from_pretrained
    model = cls(config, *model_args, **model_kwargs)
TypeError: __init__() got an unexpected keyword argument 'gradient_checkpointing'
```

How can I solve this issue? Also, is there any documentation for the pretrained model, or is its source code available? Thank you!

skr3178 commented 1 year ago

```python
AutoModelForTokenClassification.from_pretrained(model_name, num_labels=len(unique_tags), id2label=id2tag, label2id=tag2id, gradient_checkpointing=False)
```

@yuyanislearning How did you solve the issue?

mheinzinger commented 1 year ago

Maybe this is due to a transformers library version mismatch? See https://github.com/HHousen/TransformerSum/issues/57
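A version mismatch would explain the error: newer releases of the transformers library no longer accept `gradient_checkpointing` as a keyword to `from_pretrained()`, and instead expose `gradient_checkpointing_enable()` on the loaded model. A minimal sketch of that workaround (using a tiny randomly-initialized BERT config as a stand-in for the ProtBert-BFD checkpoint, so it runs without downloading weights; the sizes chosen here are arbitrary):

```python
from transformers import BertConfig, BertForTokenClassification

# Tiny config stands in for the real ProtBert-BFD checkpoint purely for
# illustration; with the real model you would use
# AutoModelForTokenClassification.from_pretrained(model_name, ...) WITHOUT
# the gradient_checkpointing kwarg.
config = BertConfig(
    hidden_size=32,
    num_hidden_layers=2,
    num_attention_heads=2,
    intermediate_size=64,
    num_labels=3,
)
model = BertForTokenClassification(config)

# Replacement for the removed `gradient_checkpointing=...` kwarg in newer
# transformers versions: enable it on the model after loading.
model.gradient_checkpointing_enable()
print(model.is_gradient_checkpointing)  # True
```

Alternatively, pinning the transformers version that the notebook was written against should make the original `gradient_checkpointing=False` kwarg work again.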