```
Traceback (most recent call last):
  File "/home/usr/Python/VQA/Cola/query/query_ofa.py", line 125, in <module>
    run_ofa(args)
  File "/home/usr/Python/VQA/Cola/query/query_ofa.py", line 99, in run_ofa
    gen = ofa_model.generate(
          ^^^^^^^^^^^^^^^^^^^
  File "/home/usr/anaconda3/envs/cola2/lib/python3.10/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/home/usr/anaconda3/envs/cola2/lib/python3.10/site-packages/transformers/generation/utils.py", line 1597, in generate
    model_kwargs = self._prepare_encoder_decoder_kwargs_for_generation(
```
Hi, thank you for your great work! Following your setup instructions, I ran the commands below and then got the error shown in the traceback above.

It seems that the versions of torch and transformers cause this error. Could you tell me your environment? I tried on the following environment.
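As an aside, a quick stdlib-only way to print the installed versions of the two packages in question, so environments can be compared, is a sketch like this (package names assumed to be the PyPI distribution names `torch` and `transformers`):

```python
from importlib.metadata import version, PackageNotFoundError

# Report the installed version of each relevant package, or note its absence.
for pkg in ("torch", "transformers"):
    try:
        print(f"{pkg}: {version(pkg)}")
    except PackageNotFoundError:
        print(f"{pkg}: not installed")
```

Alternatively, `pip freeze | grep -E "torch|transformers"` gives the same information from the shell.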