WillSuen / LostGANs


Why is the embedding dimension for label embeddings 180 and not 128? #14

Closed · silvanmurre closed this issue 3 years ago

silvanmurre commented 3 years ago

First, I'd like to say that I appreciate the great work you have done with this paper. I am training the model on a custom dataset and have been inspecting the code to become more familiar with everything.

When initializing the embedding layer in the generator here, the embedding dimension is set to 180, while your paper states that you use d_e = 128 in your experiments. As a result, the vectorized label representation (Y in the LostGAN-v2 paper) and the embedding matrix (W in the LostGAN-v2 paper) also have dimension 180. May I ask why this parameter is set to 180 in the code instead of 128? Thank you!

Edit: I think I understand that it has to do with dimension mismatches in the mask regression, but I'm still wondering where exactly the 180 comes from.

Kind regards, Silvan

WillSuen commented 3 years ago

Hi Silvan, thanks for reaching out. In LostGAN-v1 we use d_e = 128. In our experiments we tried out different embedding dimensions to explore this, but there was no significant improvement for any particular size, so we left it at 180 here. Feel free to change it to 128 or just keep 180; it doesn't affect performance much.