You may find that the function `tl.layers.list_remove_repeat` can help here, and instead of adding the losses one by one, you can use a for loop.
As I understand it, the following snippet should solve your problem:

```python
cost = 0
for p in tl.layers.list_remove_repeat(net1.all_params + net2.all_params):
    cost = cost + tl.cost.maxnorm_regularizer(1.0)(p)
```
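For context, here is a minimal sketch of how that deduplicated regularization term could be combined with the task losses. This assumes the TensorLayer 1.x / TF1 graph API; `y1_`, `y2_`, and the two cross-entropy losses are hypothetical placeholders for whatever the actual model uses:

```python
import tensorflow as tf
import tensorlayer as tl

# Hypothetical task losses for the two output heads.
ce1 = tl.cost.cross_entropy(net1.outputs, y1_, name='ce1')
ce2 = tl.cost.cross_entropy(net2.outputs, y2_, name='ce2')
cost = ce1 + ce2

# Add the max-norm regularizer once per unique parameter,
# since net1 and net2 share some variables.
for p in tl.layers.list_remove_repeat(net1.all_params + net2.all_params):
    cost = cost + tl.cost.maxnorm_regularizer(1.0)(p)

train_op = tf.train.AdamOptimizer(learning_rate=1e-4).minimize(cost)
```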
Thanks, this solved my problem.
I see from the API documentation that I can add a regularization loss like this:

```python
cost = cost + tl.cost.maxnorm_regularizer(1.0)(network.all_params[0]) + tl.cost.maxnorm_regularizer(1.0)(network.all_params[2])
```
But suppose I have a neural network with multiple output layers and only one input layer, say `out1` and `out2`. I think `out1.all_params` and `out2.all_params` will then have some parameters in common, so I cannot use the method above directly. Of course, I can use a `set` to get the unique elements after combining the two lists, but is there a more elegant way to handle this in TensorLayer? BTW, why are the weight and bias variables not created with the keyword `regularizer=`, which would add each variable's loss to `tf.GraphKeys.REGULARIZATION_LOSSES`?
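For illustration, a minimal sketch of the shared-parameter situation described above, assuming the TensorLayer 1.x graph API (`InputLayer`/`DenseLayer`); the layer sizes and names here are hypothetical:

```python
import tensorflow as tf
import tensorlayer as tl

x = tf.placeholder(tf.float32, shape=[None, 784], name='x')
net_in = tl.layers.InputLayer(x, name='input')
shared = tl.layers.DenseLayer(net_in, n_units=256, act=tf.nn.relu, name='shared')
out1 = tl.layers.DenseLayer(shared, n_units=10, name='out1')
out2 = tl.layers.DenseLayer(shared, n_units=10, name='out2')

# Both heads include the shared layer's weights and bias in all_params,
# so naively summing regularizers over out1.all_params + out2.all_params
# would penalize those shared variables twice.
```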