Hi, you can put the activation function into BatchNormLayer; this is how I do it:
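A minimal sketch of what I mean, assuming the TensorLayer 1.x API where `BatchNormLayer` takes an `act` argument (the layer names and sizes here are placeholders, not an exact script):

```python
import tensorflow as tf
import tensorlayer as tl

x = tf.placeholder(tf.float32, shape=[None, 784], name='x')
net = tl.layers.InputLayer(x, name='input')
# Keep the dense layer linear; the activation is applied by the batch-norm layer
net = tl.layers.DenseLayer(net, n_units=256, act=tf.identity, name='dense1')
# BatchNormLayer accepts the activation via its `act` argument
net = tl.layers.BatchNormLayer(net, act=tf.nn.relu, is_train=True, name='bn1')
net = tl.layers.DenseLayer(net, n_units=10, act=tf.identity, name='output')
```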
But I'm not sure what you mean by "concatenate the other part of the network"; here is a summary of tricks by others:
Hope this solves your problem.
@zsdonghao I'm sorry, I missed the activation arg in BatchNormLayer... it's my fault.
Thanks for your model script!
And the tricks link is a really good demonstration (or tutorial).
Recently I have wanted to use TensorLayer and native TensorFlow together, but I've run into a problem. In my design, I want to apply batch normalization between the dense layer and the activation layer, since this mechanism is recommended to reduce the variance. Adding a BatchNormLayer after a DenseLayer is no problem by itself, but to reach my goal I have to call a native TensorFlow function, and that function returns a plain tensor object. If I then want to concatenate the other part of the network, a redundant InputLayer has to be added to wrap the tensor. For example:
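A sketch of the pattern I'm describing, assuming TensorLayer 1.x and TensorFlow 1.x (`tf.layers.batch_normalization` stands in for whatever native op is used; names and sizes are placeholders):

```python
import tensorflow as tf
import tensorlayer as tl

x = tf.placeholder(tf.float32, shape=[None, 784], name='x')
net = tl.layers.InputLayer(x, name='input')
# Linear dense layer; batch normalization and activation come next
net = tl.layers.DenseLayer(net, n_units=256, act=tf.identity, name='dense1')

# Drop down to native TensorFlow for batch normalization.
# This returns a plain Tensor, not a TensorLayer layer object.
h = tf.layers.batch_normalization(net.outputs, training=True, name='bn1')
h = tf.nn.relu(h)

# To keep building with TensorLayer, the tensor has to be
# re-wrapped in a redundant InputLayer.
net = tl.layers.InputLayer(h, name='bn1_rewrap')
net = tl.layers.DenseLayer(net, n_units=10, act=tf.identity, name='output')
```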
The idea is from here. Is there a more direct way to use TensorLayer for this? Does TensorLayer provide some other way to be more flexible, or is adding such an InputLayer the only solution?