qiaott / MirrorGAN

Pytorch implementation of MirrorGAN
130 stars 40 forks

Issues with Training #8

Closed thomas-yu-1 closed 5 years ago

thomas-yu-1 commented 5 years ago

Hi, I am having an issue with training the network.

As I understand it, training requires two pretrained networks. I downloaded them from the GitHub links you mentioned, but when loading the text encoder:

```python
text_encoder = \
    RNN_ENCODER(self.n_words, nhidden=cfg.TEXT.EMBEDDING_DIM)
state_dict = \
    torch.load(cfg.TRAIN.NET_E, map_location=lambda storage, loc: storage)
```

I get the error:

```
size mismatch for encoder.weight: copying a param with shape torch.Size([5450, 300]) from checkpoint, the shape in current model is torch.Size([5453, 300]).
```

The image encoder loads properly. I am wondering if you could provide any insight into this?

Also, for the caption models, I saw in the cfg that you use ckpt files, but the pretrained versions of these come as pkl files. Will this be a problem?

Thanks

Bila12 commented 4 years ago

Hello @Mithrandir117. I see that you have closed the issue. Did you manage to solve it? I am facing the same size-mismatch error.

Thanks

nikunjlad commented 4 years ago

@Mithrandir117 and @Bila12, hey, did you solve the above error? I am facing the same one! I would really appreciate it if you could share the solution. @qiaott Thanks for the amazing code. Can you help with this issue?

YUMI66666 commented 4 years ago

I am also facing this problem... could somebody help me with this?

YUMI66666 commented 4 years ago

> I also face this problem... could somebody help me with this???

I solved it. If anyone has the same problem, replace BIRD_CAPTION.PICKLE with the preprocessed data from AttnGAN.
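For context, this fix works because `n_words` is derived from the vocabulary stored in the captions pickle, and the pretrained `RNN_ENCODER` checkpoint was trained with a 5450-word vocabulary, so `encoder.weight` has shape `(5450, 300)`. If your pickle yields a different vocabulary size (here 5453), `load_state_dict` fails with the size mismatch above. Below is a minimal sketch of the check, assuming the AttnGAN-style pickle layout `[train_captions, test_captions, ixtoword, wordtoix]`; the simulated pickles and the helper `vocab_size_from_pickle` are hypothetical stand-ins for illustration:

```python
def vocab_size_from_pickle(data):
    """Return n_words for an AttnGAN-style captions pickle.

    `data` mimics the unpickled structure:
    [train_captions, test_captions, ixtoword, wordtoix].
    """
    train_captions, test_captions, ixtoword, wordtoix = data
    return len(ixtoword)

# Simulated contents of two different captions pickles (hypothetical data):
# a MirrorGAN-side pickle that yields 5453 words, and the AttnGAN pickle
# that matches the pretrained encoder's 5450-word vocabulary.
mirrorgan_pickle = ([], [], {i: "w%d" % i for i in range(5453)}, {})
attngan_pickle   = ([], [], {i: "w%d" % i for i in range(5450)}, {})

checkpoint_vocab_rows = 5450  # rows of encoder.weight in the pretrained model

# Mismatch: load_state_dict would raise a size-mismatch error.
print(vocab_size_from_pickle(mirrorgan_pickle) == checkpoint_vocab_rows)  # False
# Match: the checkpoint loads cleanly.
print(vocab_size_from_pickle(attngan_pickle) == checkpoint_vocab_rows)    # True
```

In short, comparing `len(ixtoword)` from your pickle against the checkpoint's embedding rows tells you before training whether the encoder will load.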

kmeagle1515 commented 2 years ago

> I also face this problem... could somebody help me with this???
>
> I solved it.. if someone has the same prob. please replace BIRD_CAPTION.PICKLE with the preprocessed data in attnGAN

Hey, can you please explain more about how you solved it?

Thank you