NielsRogge / Transformers-Tutorials

This repository contains demos I made with the Transformers library by HuggingFace.
MIT License
8.48k stars · 1.33k forks

Fine-tune LayoutXLM on XFUND (relation extraction) #447

Open Still-FanTasy opened 2 days ago

Still-FanTasy commented 2 days ago

When I try to run "trainer.train()", the error below occurs. How can I fix it?


TypeError                                 Traceback (most recent call last)
File c:\Users\Admin\AppData\Local\Programs\Python\Python39\lib\site-packages\accelerate\utils\operations.py:158, in send_to_device(tensor, device, non_blocking, skip_keys)
    157 try:
--> 158     return tensor.to(device, non_blocking=non_blocking)
    159 except TypeError:  # .to() doesn't accept non_blocking as kwarg

TypeError: to() got an unexpected keyword argument 'non_blocking'

During handling of the above exception, another exception occurred:

AttributeError                            Traceback (most recent call last)
Cell In[10], line 1
----> 1 trainer.train()

File c:\Users\Admin\AppData\Local\Programs\Python\Python39\lib\site-packages\transformers\trainer.py:1539, in Trainer.train(self, resume_from_checkpoint, trial, ignore_keys_for_eval, **kwargs)
   1537     hf_hub_utils.enable_progress_bars()
   1538 else:
-> 1539     return inner_training_loop(
   1540         args=args,
   1541         resume_from_checkpoint=resume_from_checkpoint,
   1542         trial=trial,
   1543         ignore_keys_for_eval=ignore_keys_for_eval,
   1544     )
...
--> 789     self.data = {k: v.to(device=device) for k, v in self.data.items()}
    790 else:
    791     logger.warning(f"Attempting to cast a BatchEncoding to type {str(device)}. This is not supported.")

AttributeError: 'list' object has no attribute 'to'
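
One possible workaround, as a sketch only and not verified against this exact notebook: the traceback suggests accelerate's send_to_device sees that the BatchEncoding returned by the collator has a .to() method and calls it on the whole batch, which then fails on any list-valued fields (for LayoutXLM relation extraction these are presumably the entities and relations lists). Wrapping the existing collator so it returns a plain dict lets accelerate move each tensor individually and leave non-tensor fields alone. The names DictReturningCollator, base_collator, and existing_collator below are made up for illustration.

    from dataclasses import dataclass
    from typing import Any, Dict, List

    @dataclass
    class DictReturningCollator:
        # Wraps whatever collator the Trainer currently uses (name is hypothetical).
        base_collator: Any

        def __call__(self, features: List[Dict[str, Any]]) -> Dict[str, Any]:
            batch = self.base_collator(features)
            # dict(...) drops the BatchEncoding wrapper (and its .to() method),
            # so accelerate moves tensor values per key and returns non-tensor
            # values (e.g. lists of entity/relation dicts) unchanged.
            return dict(batch)

    # Hypothetical usage: pass the wrapped collator to the Trainer
    # trainer = Trainer(..., data_collator=DictReturningCollator(existing_collator))

Pinning transformers and accelerate to the versions the notebook was originally written against is another common way around this kind of version mismatch.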