microsoft / Oscar

Oscar and VinVL
MIT License

Oscar fails with torch 2.1 #208

Open pnunna93 opened 8 months ago

pnunna93 commented 8 months ago

Hi, I am running Oscar with torch 2.1 but am facing this error. Could you please take a look and suggest a solution?

```
Traceback (most recent call last):
  File "./Oscar/oscar/run_vqa.py", line 1232, in <module>
    main()
  File "./Oscar/oscar/run_vqa.py", line 1155, in main
    global_step, tr_loss = train(args, train_dataset, eval_dataset, model, tokenizer)
  File "./Oscar/oscar/run_vqa.py", line 477, in train
    optimizer = AdamW(optimizer_grouped_parameters, lr=args.learning_rate, eps=args.adam_epsilon)
  File "./Oscar/./transformers/pytorch_transformers/optimization.py", line 128, in __init__
    super(AdamW, self).__init__(params, defaults)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/optim/optimizer.py", line 266, in __init__
    self.add_param_group(cast(dict, param_group))
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/_compile.py", line 22, in inner
    import torch._dynamo
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/_dynamo/__init__.py", line 2, in <module>
    from . import allowed_functions, convert_frame, eval_frame, resume_execution
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/_dynamo/convert_frame.py", line 46, in <module>
    from .output_graph import OutputGraph
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/_dynamo/output_graph.py", line 35, in <module>
    from . import config, logging as torchdynamo_logging, variables
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/_dynamo/variables/__init__.py", line 53, in <module>
    from .torch import TorchVariable
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/_dynamo/variables/torch.py", line 131, in <module>
    transformers.configuration_utils.PretrainedConfig.__eq__ = (
AttributeError: module 'transformers' has no attribute 'configuration_utils'
```
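The failing frame is torch 2.1's dynamo doing `import transformers` at import time and patching `transformers.configuration_utils.PretrainedConfig.__eq__`. The `./Oscar/./transformers/pytorch_transformers/...` path earlier in the traceback suggests (this is an assumption, not confirmed in the thread) that Oscar's local `transformers` directory may be shadowing the pip-installed library, so dynamo sees a module without `configuration_utils`. A minimal stdlib-only sketch to check which module Python actually resolves:

```python
import importlib.util

# Where does `import transformers` resolve from? If the origin points into
# the Oscar checkout (./transformers/...) rather than site-packages, the
# local directory is shadowing the real library, and torch._dynamo will
# import a module that lacks `configuration_utils`.
spec = importlib.util.find_spec("transformers")
print(spec.origin if spec else "transformers not found on sys.path")
```

Running this from the Oscar repo root versus from another directory should show whether the shadowing happens.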

twoapples1 commented 2 weeks ago

I also encountered this problem. I uninstalled transformers (`pip uninstall transformers`), reinstalled a pinned version (`pip install transformers==4.42.3`), and the problem was solved.
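After reinstalling, a quick sanity check that the attribute torch 2.1's dynamo patches is actually present. The `4.42.3` pin is the one reported to work in this thread; whether other 4.x releases also work is an untested assumption. A minimal sketch:

```python
import importlib


def transformers_ok() -> bool:
    """Return True if `transformers` imports cleanly and exposes
    `configuration_utils`, the attribute torch 2.1's dynamo accesses."""
    try:
        mod = importlib.import_module("transformers")
    except Exception:
        # Not installed, or shadowed by a broken local directory.
        return False
    return hasattr(mod, "configuration_utils")


print(transformers_ok())
```

If this prints `False` even after the reinstall, the import is likely still resolving to a local `transformers` directory rather than the pip package.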