@guoying1030 Both BiRNNLayer and BiDynamicRNNLayer have an `n_layer` argument; you can simply set it to the number of stacked RNN layers you want.
Alternatively, you can stack layers like this example shows: https://github.com/zsdonghao/tensorlayer/blob/master/example/tutorial_ptb_lstm_state_is_tuple.py
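For reference, here is a minimal sketch of the `n_layer` route, assuming the TensorLayer 1.x graph API; the placeholder shape, hidden size, and layer names are just illustrative values taken from this thread, not a definitive implementation:

```python
import tensorflow as tf
import tensorlayer as tl

# Example input: 32 sequences of 1000 steps with 2304 features each.
x = tf.placeholder(tf.float32, shape=[32, 1000, 2304], name="x")

net = tl.layers.InputLayer(x, name="input")
# A 3-layer bidirectional LSTM built by setting n_layer,
# instead of stacking BiRNNLayer instances by hand.
net = tl.layers.BiRNNLayer(
    net,
    cell_fn=tf.contrib.rnn.BasicLSTMCell,
    cell_init_args={'state_is_tuple': True},
    n_hidden=1152,      # hidden units per direction
    n_steps=1000,       # sequence length
    n_layer=3,          # number of stacked BiRNN layers
    return_last=False,  # keep the full output sequence
    name="birnn",
)
```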
@zsdonghao I would like to use a multi-layer BiRNN, and it should meet the following conditions:
What should I do to achieve this with TensorLayer?
@zsdonghao
I would like to use TensorLayer to reproduce the BiRNN functionality in neon. Neon's BiRNN lets you set `split_inputs`, documented as: `split_inputs (bool)`: whether to expect the input coming from the same source or from separate sources. I do not know how to set up the equivalent of this parameter in TensorLayer's BiRNN. From the neon code, assuming the input is [32, 1000, 2304], representing [batch_size, n_step, hidden_size], then when `split_inputs` is set to True the input is split in half: one [32, 1000, 1152] tensor is used for the forward pass and the other [32, 1000, 1152] tensor is used for the backward pass. How do I set this up in TensorLayer?
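As far as I can tell, TensorLayer's BiRNNLayer has no `split_inputs` argument and feeds the same input to both directions, so one way to approximate the behaviour described above is to split the tensor yourself and run two one-directional RNNs, one per half. A rough sketch under the TensorLayer 1.x API (shapes and layer names are illustrative, and this is an approximation of neon's behaviour, not an exact port):

```python
import tensorflow as tf
import tensorlayer as tl

# [batch_size, n_step, feature] as in the question above.
x = tf.placeholder(tf.float32, shape=[32, 1000, 2304], name="x")
# Split the feature axis in half: each part is [32, 1000, 1152].
x_fw, x_bw = tf.split(x, num_or_size_splits=2, axis=2)

# Forward RNN over the first half.
net_fw = tl.layers.InputLayer(x_fw, name="in_fw")
net_fw = tl.layers.RNNLayer(
    net_fw,
    cell_fn=tf.contrib.rnn.BasicLSTMCell,
    cell_init_args={'state_is_tuple': True},
    n_hidden=1152, n_steps=1000, return_last=False, name="rnn_fw")

# Backward RNN over the second half: reverse time, run the RNN,
# then reverse its outputs back to the original time order.
net_bw = tl.layers.InputLayer(x_bw, name="in_bw")
net_bw = tl.layers.LambdaLayer(net_bw, fn=lambda t: tf.reverse(t, axis=[1]), name="rev_in")
net_bw = tl.layers.RNNLayer(
    net_bw,
    cell_fn=tf.contrib.rnn.BasicLSTMCell,
    cell_init_args={'state_is_tuple': True},
    n_hidden=1152, n_steps=1000, return_last=False, name="rnn_bw")
net_bw = tl.layers.LambdaLayer(net_bw, fn=lambda t: tf.reverse(t, axis=[1]), name="rev_out")

# Concatenate forward and backward features along the last axis: [32, 1000, 2304].
net = tl.layers.ConcatLayer([net_fw, net_bw], concat_dim=-1, name="concat")
```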
Closing as a duplicate issue.
Does TensorLayer support a deep BiRNN layer?
How do I use it?
Is a multi-layer BiRNN supported? How do I split the input?