heykeetae / Self-Attention-GAN

PyTorch implementation of Self-Attention Generative Adversarial Networks (SAGAN)

AttributeError: 'Conv2d' object has no attribute 'weight' #45

Open · 15732031137 opened this issue 4 years ago

15732031137 commented 4 years ago

Hello! Thank you for your contribution to generative adversarial network research and for sharing your code! I am from China, and it is Chinese New Year right now, so I wish you a happy Chinese New Year! I am very interested in your paper. When I try to add spectral normalization to my new network, the program gives the following error:

```
Traceback (most recent call last):
  File "SR.py", line 45, in <module>
    train(opt, Gs, Zs, reals, NoiseAmp)
  File "E:\SinGAN-masterplus\SinGAN\training.py", line 34, in train
    D_curr, G_curr = init_models(opt)
  File "E:\SinGAN-masterplus\SinGAN\training.py", line 310, in init_models
    netG.apply(models.weights_init)
  File "E:\abcd\lib\site-packages\torch\nn\modules\module.py", line 293, in apply
    module.apply(fn)
  File "E:\abcd\lib\site-packages\torch\nn\modules\module.py", line 293, in apply
    module.apply(fn)
  File "E:\abcd\lib\site-packages\torch\nn\modules\module.py", line 294, in apply
    fn(self)
  File "E:\SinGAN-masterplus\SinGAN\models.py", line 215, in weights_init
    m.weight.data.normal_(0.0, 0.02)
  File "E:\abcd\lib\site-packages\torch\nn\modules\module.py", line 591, in __getattr__
    type(self).__name__, name))
AttributeError: 'Conv2d' object has no attribute 'weight'
```

I have searched for a lot of information but could not solve it, so I want to ask you. I wish you a happy life and look forward to your reply!
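
For context, the initializer the traceback points at (models.py, line 215) is presumably a standard DCGAN-style init; the following is a reconstruction consistent with the traceback, not the exact file contents:

```python
import torch.nn as nn

# Reconstructed sketch of the failing initializer. It assumes every
# matched module still owns a raw `weight` parameter, which is exactly
# what stops being true once spectral normalization wraps the conv.
def weights_init(m):
    classname = m.__class__.__name__
    if classname.find('Conv') != -1:
        m.weight.data.normal_(0.0, 0.02)   # raises AttributeError under SN
    elif classname.find('Norm') != -1:
        m.weight.data.normal_(1.0, 0.02)
        m.bias.data.fill_(0)
```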

ShangyinGao commented 4 years ago

I got a similar issue. The reason is that, as line 59 of spectral.py shows, the weight attribute of whatever module you wrap (in your case Conv2d) is deleted; weight_u, weight_v, and weight_bar are added in its place.
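
A condensed sketch of that replacement step (paraphrasing spectral.py rather than quoting it verbatim). The final `del` is the key line: after it runs, the wrapped Conv2d no longer has a `weight` parameter until the wrapper recreates it during the forward pass:

```python
import torch.nn as nn
from torch.nn import Parameter

class SpectralNorm(nn.Module):
    """Condensed sketch: the wrapped module's `weight` parameter is
    removed and re-registered as weight_u, weight_v, and weight_bar."""
    def __init__(self, module, name='weight'):
        super().__init__()
        self.module = module
        w = getattr(module, name)
        height = w.data.shape[0]
        width = w.view(height, -1).data.shape[1]
        module.register_parameter(name + "_u",
            Parameter(w.data.new(height).normal_(0, 1), requires_grad=False))
        module.register_parameter(name + "_v",
            Parameter(w.data.new(width).normal_(0, 1), requires_grad=False))
        module.register_parameter(name + "_bar", Parameter(w.data))
        del module._parameters[name]  # `weight` is gone from here on

sn = SpectralNorm(nn.Conv2d(3, 64, 4))
print(hasattr(sn.module, 'weight'))  # False, so weights_init then raises
```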

15732031137 commented 4 years ago

Thank you for your reply! Your idea is great. Have a good life!

A-chen23 commented 4 years ago

> Thank you for your reply! Your idea is great. Have a good life!

I encountered the same problem as you. Have you solved it? Could you share your solution? Thank you.

angrybird-Z commented 3 years ago

OP, how did you solve this? I also ran into this problem when I added an SN layer myself.

angrybird-Z commented 3 years ago

Solved it. The fix was simply to delete the custom initialization of the generator and discriminator and use PyTorch's default initialization. The reason is that the SN layer is a composite network structure, and the initializer cannot drill down through it to find the underlying conv and initialize it. So if you want to add SN layers, either delete netD.apply(weights_init) and netG.apply(weights_init), or rewrite the SN structure so that it and the conv2d are pulled out and written separately (see the sketch below).
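
To make that concrete, here is my own sketch (not code from this repo): either guard the initializer so it skips spectrally normalized modules, or initialize the raw conv first and wrap it afterwards, so `apply()` never meets a weight-less Conv2d:

```python
import torch.nn as nn

def weights_init(m):
    # Guarded variant: only touch modules that still own a raw `weight`
    # parameter; spectrally normalized convs are silently skipped.
    if isinstance(m, nn.Conv2d) and 'weight' in m._parameters:
        m.weight.data.normal_(0.0, 0.02)

# Order-based variant: initialize first, wrap afterwards. Shown with
# PyTorch's built-in nn.utils.spectral_norm; the repo's own SpectralNorm
# wrapper could be substituted.
conv = nn.Conv2d(64, 64, 3, padding=1)
conv.apply(weights_init)                 # `weight` still exists here
sn_conv = nn.utils.spectral_norm(conv)
```

Either variant keeps netG.apply(weights_init) usable; simply deleting the two apply calls, as suggested above, falls back to PyTorch's default initialization.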

You accidentally let out some Chinese there.

Your actions explain why Hitler killed Jews.

Liz1317 commented 2 years ago

Has this problem been solved?

SimonPig commented 2 years ago

> Your actions explain why Hitler killed Jews.

That's a little too much.