simoRancati opened this issue 3 months ago
I got the same error for a while. The issue is the transformers version.
This code worked for me:
```python
!pip install transformers==4.19.2

from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline

model = AutoModelForCausalLM.from_pretrained("lightonai/RITA_s", trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained("lightonai/RITA_s")

rita_gen = pipeline("text-generation", model=model, tokenizer=tokenizer)
sequences = rita_gen("MAB", max_length=20, do_sample=True, top_k=950,
                     repetition_penalty=1.2, num_return_sequences=2, eos_token_id=2)
for seq in sequences:
    print(f"seq: {seq['generated_text'].replace(' ', '')}")
```
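For reference, a minimal sketch of roughly the same generation without the pipeline wrapper, assuming the pinned transformers==4.19.2 environment and the same sampling parameters as above:

```python
# Minimal sketch: same generation without the pipeline wrapper,
# assuming the pinned transformers==4.19.2 environment above.
inputs = tokenizer("MAB", return_tensors="pt")
outputs = model.generate(
    inputs["input_ids"],
    max_length=20,
    do_sample=True,
    top_k=950,
    repetition_penalty=1.2,
    num_return_sequences=2,
    eos_token_id=2,
)
for ids in outputs:
    # Strip spaces from the decoded text, mirroring the replace(' ', '') above.
    print(tokenizer.decode(ids, skip_special_tokens=True).replace(" ", ""))
```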
A closed issue also describes this exact problem.
Hope this helps.
Dear Authors, when I try to run your example.py code to generate protein sequences, I get this error:
```
Traceback (most recent call last):
  File "/Users/utente/Desktop/Varcovid/GenSeq/RITA.py", line 10, in <module>
    sequences = rita_gen("MAB", max_length=20, do_sample=True, top_k=950, repetition_penalty=1.2, num_return_sequences=2, eos_token_id=2, truncation=True)
  File "/Users/utente/Library/Python/3.9/lib/python/site-packages/transformers/pipelines/text_generation.py", line 240, in __call__
    return super().__call__(text_inputs, **kwargs)
  File "/Users/utente/Library/Python/3.9/lib/python/site-packages/transformers/pipelines/base.py", line 1206, in __call__
    return self.run_single(inputs, preprocess_params, forward_params, postprocess_params)
  File "/Users/utente/Library/Python/3.9/lib/python/site-packages/transformers/pipelines/base.py", line 1213, in run_single
    model_outputs = self.forward(model_inputs, **forward_params)
  File "/Users/utente/Library/Python/3.9/lib/python/site-packages/transformers/pipelines/base.py", line 1112, in forward
    model_outputs = self._forward(model_inputs, **forward_params)
  File "/Users/utente/Library/Python/3.9/lib/python/site-packages/transformers/pipelines/text_generation.py", line 327, in _forward
    generated_sequence = self.model.generate(input_ids=input_ids, attention_mask=attention_mask, **generate_kwargs)
  File "/Users/utente/Library/Python/3.9/lib/python/site-packages/torch/autograd/grad_mode.py", line 28, in decorate_context
    return func(*args, **kwargs)
  File "/Users/utente/Library/Python/3.9/lib/python/site-packages/transformers/generation/utils.py", line 1323, in generate
    self._validate_model_class()
  File "/Users/utente/Library/Python/3.9/lib/python/site-packages/transformers/generation/utils.py", line 1064, in _validate_model_class
    raise TypeError(exception_message)
TypeError: The current model class (RITAModelForCausalLM) is not compatible with `.generate()`, as it doesn't have a language model head. Please use one of the following classes instead: {'RITAModelForCausalLM'}
```

What can I do to solve it?
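For completeness: the check that raises this TypeError is `can_generate()`, which newer transformers versions run inside `.generate()`. If pinning to 4.19.2 is not possible, a hypothetical, untested workaround is to override that check on the loaded model before building the pipeline. This sketch assumes the remote RITA class really does have an LM head and merely predates the `can_generate()` API:

```python
# Hypothetical workaround, untested with RITA: newer transformers gate
# .generate() on PreTrainedModel.can_generate(); shadow it per-instance.
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline

model = AutoModelForCausalLM.from_pretrained("lightonai/RITA_s", trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained("lightonai/RITA_s")

# Assumption: the check is a false negative for this remote class, so we
# override it; this may mask real incompatibilities in other models.
model.can_generate = lambda: True

rita_gen = pipeline("text-generation", model=model, tokenizer=tokenizer)
```

Pinning the transformers version, as shown above, remains the safer fix.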