ennauata / housegan

House-GAN: Relational Generative Adversarial Networks for Graph-constrained House Layout Generation
https://ennauata.github.io/housegan/page.html

Can't have outputs with a different --latent_dim (default=128) #16

Closed · rickkk856 closed this issue 3 years ago

rickkk856 commented 3 years ago

I was trying to figure out how the model would work with a bigger or smaller latent space, but while running

```
%run variation_bbs_with_target_graph_segments_suppl.py --batch_size 1 --channels 1 --exp_folder exp --latent_dim 64 --num_variations 4
```

for both --latent_dim 64 and --latent_dim 256 I got similar errors:


```
---------------------------------------------------------------------------
RuntimeError                              Traceback (most recent call last)
~\housegan\variation_bbs_with_target_graph_segments_suppl.py in <module>
    212 z = Variable(Tensor(np.random.normal(0, 1, (real_mks.shape[0], opt.latent_dim))))
    213 with torch.no_grad():
--> 214     gen_mks = generator(z, given_nds, given_eds)
    215 gen_bbs = np.array([np.array(mask_to_bb(mk)) for mk in gen_mks.detach().cpu()])
    216 real_bbs = np.array([np.array(mask_to_bb(mk)) for mk in real_mks.detach().cpu()])

~\anaconda3\envs\housegan\lib\site-packages\torch\nn\modules\module.py in __call__(self, *input, **kwargs)
    548             result = self._slow_forward(*input, **kwargs)
    549         else:
--> 550             result = self.forward(*input, **kwargs)
    551         for hook in self._forward_hooks.values():
    552             hook_result = hook(self, input, result)

~\housegan\models.py in forward(self, z, given_y, given_w)
    147         if True:
    148             y = given_y.view(-1, 10)
--> 149             z = torch.cat([z, y], 1)
    150         x = self.l1(z)
    151         x = x.view(-1, 16, self.init_size, self.init_size)

RuntimeError: Sizes of tensors must match except in dimension 0. Got 10 and 5
```

PS: I'm using the pre-trained model
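One quick way to check which latent dimension the pretrained weights were built for is to inspect the layer shapes stored in the checkpoint itself. A minimal sketch, assuming a placeholder checkpoint path (substitute the actual pretrained file from the repo):

```python
import torch

# "checkpoints/pretrained.pth" is a placeholder path, not the repo's actual filename.
state = torch.load("checkpoints/pretrained.pth", map_location="cpu")

# In a state dict, 2-D tensors are typically linear-layer weights of shape
# (out_features, in_features). Per the traceback above, the generator's first
# layer consumes z concatenated with a 10-dim room-type vector, so its
# in_features should equal latent_dim + 10.
for name, tensor in state.items():
    if tensor.dim() == 2:
        print(name, tuple(tensor.shape))
```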

ennauata commented 3 years ago

Hi @rickkk856, I haven't explored changing the latent dimension much, so this feature is currently not fully implemented. If you want to change the latent dimension, you would have to manually change the network's input layer accordingly. Also, the pretrained model assumes a specific latent dimension (128), so I believe you would not be able to reuse the pretrained weights after changing it; you would have to re-train the model from scratch with the new dimensions.
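To make the dependency concrete, here is a minimal sketch (not the actual House-GAN generator, just the pattern visible in the traceback above): the first linear layer's in_features is tied to latent_dim, so a generator built with a different latent dimension has differently shaped weights and cannot load the 128-dim pretrained state dict.

```python
import torch
import torch.nn as nn

class TinyGenerator(nn.Module):
    """Toy stand-in for the generator's input stage; sizes are assumptions."""

    def __init__(self, latent_dim=128, num_room_types=10, init_size=8):
        super().__init__()
        self.init_size = init_size
        # The latent vector z is concatenated with a 10-dim room-type vector y,
        # so in_features = latent_dim + num_room_types bakes latent_dim into
        # this layer's weight shape.
        self.l1 = nn.Linear(latent_dim + num_room_types, 16 * init_size ** 2)

    def forward(self, z, y):
        x = self.l1(torch.cat([z, y], 1))
        return x.view(-1, 16, self.init_size, self.init_size)

g128 = TinyGenerator(latent_dim=128)
g64 = TinyGenerator(latent_dim=64)
print(g128.l1.weight.shape)  # torch.Size([1024, 138])
print(g64.l1.weight.shape)   # torch.Size([1024, 74]) -> incompatible shapes
```

Attempting `g64.load_state_dict(g128.state_dict())` raises a size-mismatch error for `l1.weight`, which is why changing `--latent_dim` requires retraining rather than reusing the pretrained checkpoint.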