Lsan2401 / RMSIN

Rotated Multi-Scale Interaction Network for Referring Remote Sensing Image Segmentation

Bug encountered when running train.py #8

Open edwd38165 opened 9 months ago

edwd38165 commented 9 months ago

Hello! Thank you very much for your work. I configured the environment as described in your README and downloaded the RRSIS-D dataset in two ways, storing each copy in a separate folder. However, when I run train.py I get the following problems:

1. Dataset downloaded from Google Drive: OSError: Unable to load weights from pytorch checkpoint file. If you tried to load a PyTorch model from a TF 2.0 checkpoint, please set from_tf=True.

2. Dataset downloaded from Baidu Netdisk: RuntimeError: Error(s) in loading state_dict for BertModel: size mismatch for bert.embeddings.word_embeddings.weight: copying a param with shape torch.Size([21128, 768]) from checkpoint, the shape in current model is torch.Size([30522, 768]).

I would like to know which version of the dataset is more complete and how to resolve the corresponding errors.
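
Both errors appear to point at the BERT checkpoint rather than the images or annotations: a vocabulary of 21128 tokens matches a Chinese BERT checkpoint, while the model expects the 30522-token bert-base-uncased vocabulary, and the first error typically indicates a corrupt or incomplete pytorch_model.bin. A minimal diagnostic sketch (the checkpoint path is an assumption; point it at whatever file train.py loads for BERT):

```python
import torch

# Assumed location -- substitute the BERT checkpoint path used by train.py.
ckpt_path = "./pretrained/bert-base-uncased/pytorch_model.bin"

# If this load itself fails, the file is likely an incomplete or corrupt
# download (consistent with the first error above).
state_dict = torch.load(ckpt_path, map_location="cpu")

# Inspect the word-embedding shape: (30522, 768) is bert-base-uncased,
# (21128, 768) is a Chinese BERT vocabulary, which would explain the mismatch.
emb_key = next(k for k in state_dict if k.endswith("word_embeddings.weight"))
print(emb_key, tuple(state_dict[emb_key].shape))
```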

Lsan2401 commented 9 months ago

The dataset in both Google Drive and Baidu NetDisk is identical, so the error may not be caused by the dataset. Could you provide more detailed information about the error?
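
If the BERT weights rather than the dataset turn out to be the culprit, one possible workaround (a sketch, not the repository's prescribed setup; the local save path is an assumption) is to re-download a clean bert-base-uncased copy through transformers:

```python
from transformers import BertModel, BertTokenizer

# Fetch fresh bert-base-uncased weights and tokenizer files.
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")

# Save a local copy; the directory is hypothetical -- use whatever path
# train.py expects for its BERT checkpoint.
tokenizer.save_pretrained("./pretrained/bert-base-uncased")
model.save_pretrained("./pretrained/bert-base-uncased")
```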

edwd38165 commented 8 months ago

Thank you very much for your response. My specific problem is shown in the screenshot below; I hope you can tell me how to fix it.

[screenshot attachment: 1708525462772]

P.S. My environment is configured as you provided.