Closed yuyanislearning closed 2 years ago
@yuyanislearning How did you solve the issue?
Maybe this is due to a Transformers library version mismatch? --> https://github.com/HHousen/TransformerSum/issues/57
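If a Transformers version mismatch is the cause, one possible workaround (a sketch only, not the notebook's actual code — the tiny config and tag set below are stand-ins so it runs without downloading ProtBert-BFD) is to drop `gradient_checkpointing` from the model-construction kwargs and, on transformers >= 4.11, toggle it through the dedicated method instead:

```python
from transformers import BertConfig, BertForTokenClassification

# Hypothetical tag set standing in for the notebook's unique_tags
unique_tags = ["B", "I", "O"]
tag2id = {t: i for i, t in enumerate(unique_tags)}
id2tag = {i: t for t, i in tag2id.items()}

# Tiny config so this sketch runs offline; the real notebook would use
# from_pretrained on the ProtBert-BFD checkpoint instead.
config = BertConfig(
    hidden_size=32,
    num_hidden_layers=2,
    num_attention_heads=2,
    intermediate_size=64,
    num_labels=len(unique_tags),
    id2label=id2tag,
    label2id=tag2id,
)
model = BertForTokenClassification(config)

# Newer transformers releases reject gradient_checkpointing as an
# __init__ kwarg; enable it via the method after construction instead.
model.gradient_checkpointing_enable()
print(model.is_gradient_checkpointing)
```

With a pretrained checkpoint the same idea applies: call `AutoModelForTokenClassification.from_pretrained(model_name, num_labels=..., id2label=..., label2id=...)` without the `gradient_checkpointing` kwarg, then `model.gradient_checkpointing_enable()`.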
I am using PyTorch 1.9.0a and running the ProtBert-BFD-FineTune-SS3.ipynb notebook. I hit an error during model init with the following message:
AutoModelForTokenClassification.from_pretrained(model_name, num_labels=len(unique_tags), id2label=id2tag, label2id=tag2id, gradient_checkpointing=False)
File "finetune_protbert.py", line 195, in model_init
    return AutoModelForTokenClassification.from_pretrained(model_name,
File "/opt/conda/lib/python3.8/site-packages/transformers/models/auto/auto_factory.py", line 419, in from_pretrained
    return model_class.from_pretrained(pretrained_model_name_or_path, *model_args, config=config, **kwargs)
File "/opt/conda/lib/python3.8/site-packages/transformers/modeling_utils.py", line 1402, in from_pretrained
    model = cls(config, *model_args, **model_kwargs)
TypeError: __init__() got an unexpected keyword argument 'gradient_checkpointing'
How can I solve this issue? Also, is there any documentation on the pretrained model, or is its source code available? Thank you!