yangheng95 / PyABSA

Sentiment Analysis, Text Classification, Text Augmentation, Text Adversarial defense, etc.;
https://pyabsa.readthedocs.io
MIT License

Loading state_dict fails (model trained on linux and copied to windows) #372

Open christianjosef27 opened 9 months ago

christianjosef27 commented 9 months ago

Version pyabsa==2.3.1 torch==1.13.0 transformers==4.29.0

Describe the bug: Loading my custom state_dict on my Windows system used to work. However, I am now training on a Linux server for resource reasons. I trained a sample model and copied the whole folder containing the model files (.args, config, .state_dict, .tokenizer) to my Windows system. When I try to load that model the same way as before, I get errors (refer to the screenshot for details).

RuntimeError: Error(s) in loading state_dict for FAST_LCF_ATEPC: Missing key(s) in state_dict: "bert4global.embeddings.position_ids".
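For context, this "Missing key(s)" error is what `load_state_dict` raises in strict mode when the checkpoint lacks an entry the model expects; newer transformers releases stopped persisting the `position_ids` buffer, so a checkpoint saved there will be missing it for an older install. A minimal, self-contained sketch (toy module, not the real FAST_LCF_ATEPC) showing the failure mode and how `strict=False` surfaces it instead of raising:

```python
import torch
import torch.nn as nn

class Embeddings(nn.Module):
    def __init__(self):
        super().__init__()
        self.word = nn.Embedding(10, 4)
        # Older setups register position_ids as a persistent buffer,
        # so it is expected to be present in any loaded state_dict.
        self.register_buffer("position_ids", torch.arange(8).unsqueeze(0))

model = Embeddings()

# Simulate a checkpoint saved by an environment that no longer
# persists the buffer: same weights, but no "position_ids" entry.
ckpt = {k: v for k, v in model.state_dict().items() if k != "position_ids"}

# strict=True (the default) raises RuntimeError: Missing key(s) ...
# strict=False loads what matches and reports the rest.
missing, unexpected = model.load_state_dict(ckpt, strict=False)
print(missing)      # ['position_ids']
print(unexpected)   # []
```

Whether `strict=False` is safe here depends on the buffer being recomputable from config, which it is for `position_ids` (just `arange` over the max sequence length).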

if not hasattr(ATEPCModelList, self.model.__class__.__name__):
    raise KeyError(
        "The checkpoint you are loading is not from any ATEPC model."
    )

Code To Reproduce

aspect_extractor = ATEPC.AspectExtractor(
    'fast_lcf_atepc_custom_dataset_cdw_apcacc_75.0_apcf1_74.31_atef1_40.45',
    auto_device=True,  # False means load model on CPU
    cal_perplexity=True,
)

Expected behavior: The program loads my custom checkpoint/saved state_dict.

Screenshots: (attached image)

christianjosef27 commented 9 months ago

I now have an idea why it does not work: on the Linux machine where I trained I had a different transformers version (4.35.2), while on Windows I have transformers==4.29, which is likely the problem when loading the state_dict.
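That version gap fits the error: transformers 4.35 no longer saves the `position_ids` buffer, while 4.29 still expects it. The cleanest fix is to install the same transformers version on Windows as on the training server. Alternatively, the checkpoint can be patched so the older install loads it strictly; a sketch, where the key name comes from the traceback above and the 512 length is an assumption (the bert-base default `max_position_embeddings`, adjust if your config differs):

```python
import torch

def patch_position_ids(sd,
                       key="bert4global.embeddings.position_ids",
                       max_len=512):
    """Re-insert the position_ids buffer that newer transformers
    releases no longer save, so an older install can load strictly.
    The buffer is just 0..max_len-1 with a leading batch dim."""
    if key not in sd:
        sd[key] = torch.arange(max_len).unsqueeze(0)
    return sd

# Usage (hypothetical path; point it at your actual .state_dict file):
# sd = torch.load("fast_lcf_atepc.state_dict", map_location="cpu")
# torch.save(patch_position_ids(sd), "fast_lcf_atepc.state_dict")
```

Matching the training environment's versions remains the safer option, since other state_dict layout changes between releases would not be covered by this one-key patch.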