xinyadu / RGQA


Type error: prepare_inputs_for_generation() missing 1 required positional argument: 'past' #1

Open ChesterXi opened 1 year ago

ChesterXi commented 1 year ago

When running the test step, I get this error:

```
  File "train.py", line 220, in <module>
    main()
  File "train.py", line 211, in main
    trainer.test(model, datamodule=dm)  # also loads training dataloader
  File "/root/miniconda3/envs/EAE/lib/python3.8/site-packages/pytorch_lightning/trainer/trainer.py", line 718, in test
    results = self.__test_given_model(model, test_dataloaders)
  File "/root/miniconda3/envs/EAE/lib/python3.8/site-packages/pytorch_lightning/trainer/trainer.py", line 783, in __test_given_model
    results = self.fit(model)
  File "/root/miniconda3/envs/EAE/lib/python3.8/site-packages/pytorch_lightning/trainer/trainer.py", line 444, in fit
    results = self.accelerator_backend.train()
  File "/root/miniconda3/envs/EAE/lib/python3.8/site-packages/pytorch_lightning/accelerators/gpu_accelerator.py", line 63, in train
    results = self.train_or_test()
  File "/root/miniconda3/envs/EAE/lib/python3.8/site-packages/pytorch_lightning/accelerators/accelerator.py", line 72, in train_or_test
    results = self.trainer.run_test()
  File "/root/miniconda3/envs/EAE/lib/python3.8/site-packages/pytorch_lightning/trainer/trainer.py", line 627, in run_test
    eval_loop_results, _ = self.run_evaluation(test_mode=True)
  File "/root/miniconda3/envs/EAE/lib/python3.8/site-packages/pytorch_lightning/trainer/trainer.py", line 578, in run_evaluation
    output = self.evaluation_loop.evaluation_step(test_mode, batch, batch_idx, dataloader_idx)
  File "/root/miniconda3/envs/EAE/lib/python3.8/site-packages/pytorch_lightning/trainer/evaluation_loop.py", line 169, in evaluation_step
    output = self.trainer.accelerator_backend.test_step(args)
  File "/root/miniconda3/envs/EAE/lib/python3.8/site-packages/pytorch_lightning/accelerators/gpu_accelerator.py", line 103, in test_step
    output = self.__test_step(args)
  File "/root/miniconda3/envs/EAE/lib/python3.8/site-packages/pytorch_lightning/accelerators/gpu_accelerator.py", line 111, in __test_step
    output = self.trainer.model.test_step(*args)
  File "/root/autodl-tmp/RGQA-master/src/genie/model.py", line 114, in test_step
    sample_output = self.model.generate(batch['input_token_ids'], do_sample=False,
  File "/root/miniconda3/envs/EAE/lib/python3.8/site-packages/torch/autograd/grad_mode.py", line 26, in decorate_context
    return func(*args, **kwargs)
  File "/root/miniconda3/envs/EAE/lib/python3.8/site-packages/transformers/generation_utils.py", line 970, in generate
    return self.greedy_search(
  File "/root/miniconda3/envs/EAE/lib/python3.8/site-packages/transformers/generation_utils.py", line 1270, in greedy_search
    model_inputs = self.prepare_inputs_for_generation(input_ids, **model_kwargs)
TypeError: prepare_inputs_for_generation() missing 1 required positional argument: 'past'
```

Is it the transformers version? The requirements pin transformers 3.1.0, but sentence-transformers requires at least 4.6.0. What is the problem?
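(For anyone hitting the same error: the signature mismatch comes from the transformers upgrade. The 4.x `generate()` path passes the decoder cache as the keyword `past_key_values`, while code written against 3.1.0 defines `prepare_inputs_for_generation` with a required positional `past`. If pinning the old version is not an option, a monkey-patch along these lines can bridge the two signatures. This is a hypothetical sketch, not part of the repo; `patch_prepare_inputs` is an illustrative name.)

```python
# Hypothetical compatibility shim: transformers 4.x passes the cache as the
# keyword "past_key_values", while code written for 3.1.0 expects a required
# positional "past". Wrapping the method bridges the two signatures.

def patch_prepare_inputs(model):
    # Capture the original (3.x-style) bound method before replacing it.
    original = model.prepare_inputs_for_generation

    def wrapper(input_ids, past=None, past_key_values=None, **kwargs):
        # Map the 4.x keyword onto the 3.x positional argument.
        if past is None:
            past = past_key_values
        return original(input_ids, past, **kwargs)

    # transformers' generate() looks the method up on the instance,
    # so an instance-level override is enough.
    model.prepare_inputs_for_generation = wrapper
    return model
```

Other keyword arguments (e.g. `attention_mask`, `use_cache`) are forwarded untouched via `**kwargs`, so the old implementation still receives everything it expects.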

xinyadu commented 1 year ago

Please use exactly the requirements in the README; we haven't tried other configurations yet. e.g.

```
sentence_transformers=2.1.0
pytorch=1.6
transformers=3.1.0
pytorch-lightning=1.0.6
```

ChesterXi commented 1 year ago

> Please use exactly the requirements in the README; we haven't tried other configurations yet. e.g.
>
> sentence_transformers=2.1.0 pytorch=1.6 transformers=3.1.0 pytorch-lightning=1.0.6

sentence-transformers requires transformers>=4.0.6, which cannot be installed at the same time as transformers=3.1.0, so I forcibly overwrote the transformers version back to 3.1.0. The experiment then ran successfully, but after training for 6 epochs, inference gave an F1 of only 40%. How should I set up the environment, and what causes this large deviation in the results?
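(One way to sidestep this resolver conflict, sketched below as an assumption rather than the repo's official setup: install the pinned versions first, then install sentence-transformers with pip's `--no-deps` flag so it cannot drag in a newer transformers. Its remaining dependencies then have to be installed by hand; the list shown is illustrative, check its setup metadata for the real one.)

```shell
# Hypothetical environment-setup sketch (not from the repo's README):
# pin the versions the authors tested first.
pip install torch==1.6.0 transformers==3.1.0 pytorch-lightning==1.0.6

# --no-deps stops pip from upgrading transformers to satisfy
# sentence-transformers' own requirement (transformers>=4).
pip install --no-deps sentence-transformers==2.1.0

# sentence-transformers' other runtime dependencies must then be
# installed manually, e.g. (illustrative, not an exhaustive list):
pip install scipy scikit-learn nltk
```

Note that running sentence-transformers 2.1.0 against transformers 3.1.0 is itself untested territory; the safest path is still whatever exact combination the authors used.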

(screenshots of the evaluation results attached)

xinyadu commented 1 year ago

Did you do the training yourself? Have you tried downloading the ckpt and running the inference?

ChesterXi commented 1 year ago

This result comes from the ckpt on GitHub. Is the environment still correct when I override the transformers version?


xinyadu commented 1 year ago

I don't get it. Anyway, when you run inference directly with the ckpt (in the correct environment), the results should be exactly reproduced.