thunlp / OpenPrompt

An Open-Source Framework for Prompt-Learning.
https://thunlp.github.io/OpenPrompt/
Apache License 2.0

ValueError: The following `model_kwargs` are not used by the model #219

Closed · Vimos closed this 1 year ago

Vimos commented 1 year ago

The following error is reported when running 6.1_chinese_dataset_uer_t5.py with transformers==4.24.0.

---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
/tmp/ipykernel_34997/2417744918.py in <module>
     64 
     65         if actual_step % gradient_accumulation_steps == 0 and glb_step > 0 and glb_step % args.eval_every_steps == 0:
---> 66             val_acc = evaluate(prompt_model, validation_dataloader)
     67             if val_acc >= best_val_acc:
     68                 # torch.save(prompt_model.state_dict(),f"{args.project_root}/../ckpts/{this_run_unicode}.ckpt")

/tmp/ipykernel_34997/2191282827.py in evaluate(prompt_model, dataloader)
     24         if use_cuda:
     25             inputs = inputs.cuda()
---> 26         _, output_sentence = prompt_model.generate(inputs, verbose=False, **generation_arguments)
     27         predictions.extend(output_sentence)
     28         ground_truths.extend(inputs['tgt_text'])

~/anaconda3/lib/python3.9/site-packages/openprompt/pipeline_base.py in generate(self, batch, verbose, **generation_kwargs)
    497             self.generate_ith_token = 0
    498             self.in_generation_function = True
--> 499             output_sequences = super().generate(**batch, **input_generation_kwargs, pad_token_id=self.tokenizer.pad_token_id, eos_token_id=self.tokenizer.eos_token_id)
    500             self.in_generation_function = False
    501             output_sequences = output_sequences.cpu().tolist()

~/anaconda3/lib/python3.9/site-packages/torch/autograd/grad_mode.py in decorate_context(*args, **kwargs)
     25         def decorate_context(*args, **kwargs):
     26             with self.clone():
---> 27                 return func(*args, **kwargs)
     28         return cast(F, decorate_context)
     29 

~/anaconda3/lib/python3.9/site-packages/transformers/generation_utils.py in generate(self, inputs, max_length, min_length, do_sample, early_stopping, num_beams, temperature, penalty_alpha, top_k, top_p, typical_p, repetition_penalty, bad_words_ids, force_words_ids, bos_token_id, pad_token_id, eos_token_id, length_penalty, no_repeat_ngram_size, encoder_no_repeat_ngram_size, num_return_sequences, max_time, max_new_tokens, decoder_start_token_id, use_cache, num_beam_groups, diversity_penalty, prefix_allowed_tokens_fn, logits_processor, renormalize_logits, stopping_criteria, constraints, output_attentions, output_hidden_states, output_scores, return_dict_in_generate, forced_bos_token_id, forced_eos_token_id, remove_invalid_values, synced_gpus, exponential_decay_length_penalty, suppress_tokens, begin_suppress_tokens, forced_decoder_ids, **model_kwargs)
   1266         # 0. Validate the `.generate()` call
   1267         self._validate_model_class()
-> 1268         self._validate_model_kwargs(model_kwargs.copy())
   1269 
   1270         # 1. Set generation parameters if not already defined

~/anaconda3/lib/python3.9/site-packages/transformers/generation_utils.py in _validate_model_kwargs(self, model_kwargs)
    962 
    963         if unused_model_args:
--> 964             raise ValueError(
    965                 f"The following `model_kwargs` are not used by the model: {unused_model_args} (note: typos in the"
    966                 " generate arguments will also show up in this list)"

ValueError: The following `model_kwargs` are not used by the model: ['attention_mask', 'label', 'loss_ids', 'tgt_text'] (note: typos in the generate arguments will also show up in this list)
fseasy commented 1 year ago

Personally tested: transformers-4.19.0 works. Earlier versions raise `ImportError: cannot import name 'OPTConfig' from 'transformers'`. Downgrading to this version will re-download the models, though…
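
If you pin the version (typically via `pip install transformers==4.19.0`), a quick sanity check before rerunning the tutorial might look like this; the expected version string is simply the one reported above as working:

```python
# Confirm the downgrade took effect before rerunning the tutorial script;
# 4.19.0 is the version fseasy reports as compatible with OpenPrompt here.
import transformers

assert transformers.__version__ == "4.19.0", transformers.__version__
```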

miangangzhen commented 1 year ago

The arguments `'attention_mask'`, `'label'`, `'loss_ids'`, and `'tgt_text'` should not be passed into the `super().generate()` call.
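
A minimal sketch of that idea, assuming the batch behaves like a dict and that the four keys from the `ValueError` above are the only offenders (both are assumptions; this is not the actual OpenPrompt patch):

```python
# Hypothetical helper: strip the batch keys that the underlying model's
# generate() rejects before forwarding the rest. The key list is taken
# from the ValueError above and may differ for other templates/models.
UNUSED_KEYS = {"attention_mask", "label", "loss_ids", "tgt_text"}

def filter_generation_kwargs(batch):
    """Return a copy of `batch` without the keys generate() cannot use."""
    return {k: v for k, v in batch.items() if k not in UNUSED_KEYS}
```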

ChristLBUPT commented 1 year ago

> Personally tested: transformers-4.19.0 works. Earlier versions raise `ImportError: cannot import name 'OPTConfig' from 'transformers'`. Downgrading to this version will re-download the models, though…

Thanks! I ran into the same problem and suspected it was a transformers version issue. I tried this and it indeed works!

If you want to avoid re-downloading the model, you can download the model weights directly from huggingface.co, then set the `model_path` argument of `load_plm` to the local directory containing the downloaded weights.
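
For example, a minimal sketch (the local directory name is hypothetical):

```python
# Load the pretrained LM from a locally downloaded checkpoint instead of
# a Hub model id, so no re-download is triggered after the downgrade.
from openprompt.plms import load_plm

plm, tokenizer, model_config, WrapperClass = load_plm(
    "t5",                     # model family used by the tutorial
    "./uer-t5-base-chinese",  # hypothetical local path to the weights
)
```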

yulinchen99 commented 1 year ago

Hi, the problem has been fixed for higher versions of transformers: we overrode `_validate_model_kwargs` in `PromptForGeneration`, and it now performs no checking.
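
For reference, the shape of that fix, sketched here as a monkey-patch (the actual OpenPrompt change overrides the method inside the library; this only illustrates the mechanism):

```python
# Sketch of the fix: turn transformers' kwargs validation into a no-op so
# generate() stops rejecting OpenPrompt's extra batch keys. Not the exact
# upstream patch, just the idea it relies on.
from openprompt import PromptForGeneration

def _validate_model_kwargs(self, model_kwargs):
    # OpenPrompt forwards template bookkeeping keys (e.g. 'loss_ids',
    # 'tgt_text') that the underlying model ignores; skip the check.
    pass

PromptForGeneration._validate_model_kwargs = _validate_model_kwargs
```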