joseph-chan closed this issue 7 years ago
input_bias = ParamAttr(name='input.bias', initial_mean=0., initial_std=0.)
hidden_attr = ParamAttr(name='hidden.w', l2_rate=1e-3, initial_mean=0.,
                        initial_std=1. / np.sqrt(self.hidden_layer_dim / 2.))
hidden_bias = ParamAttr(name='input.bias', initial_mean=0., initial_std=0.)
The problem is the bias: the same name is specified twice here... In general, a bias's initial value defaults to 0, so simply writing
paddle.layer.fc(input=hiddenlayer, size=self.input_layer_dim, bias_attr=True, act=paddle.activation.Softmax())
is enough.
It was indeed carelessness; the name "input.bias" was duplicated. Thanks for helping point out the problem.
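The failure mode described above can be illustrated without Paddle at all: frameworks that key parameters by name treat two attributes with the same name as one shared parameter, so registering the same name with two conflicting shapes triggers a size-mismatch error. The sketch below is a hypothetical, minimal model of that behavior; the class `ParamRegistry` and the shapes used are illustrative, not Paddle's actual internals.

```python
class ParamRegistry:
    """Toy model of name-keyed parameter sharing."""

    def __init__(self):
        self.shapes = {}

    def register(self, name, shape):
        # A repeated name must refer to the same parameter,
        # so a conflicting shape is an error.
        if name in self.shapes and self.shapes[name] != shape:
            raise ValueError(
                "parameter '%s' already registered with shape %s, got %s"
                % (name, self.shapes[name], shape))
        self.shapes[name] = shape


registry = ParamRegistry()
# bias of the hidden layer (illustrative size 1024)
registry.register('input.bias', (1024,))
try:
    # the output layer accidentally reuses the same name with a new shape
    registry.register('input.bias', (100,))
except ValueError as e:
    print(e)
```

Renaming the second bias (e.g. to 'hidden.bias') or dropping the explicit name, as suggested above with `bias_attr=True`, avoids the collision.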
While training a simple three-layer autoencoder, a problem with the network's parameter size appears. The code for the whole network structure is below; why does a parameter-count mismatch error occur? input_layer_dim = 100, hidden_layer_dim = 1024. The network structure is as follows:
The following error appears: