This is the intended behavior: the different layers share weights to reduce the number of parameters. A similar example can be found in the TensorFlow tutorial: https://github.com/tensorflow/tensorflow/blob/0.6.0/tensorflow/g3doc/tutorials/recurrent/index.md
You can also use different parameters for each layer with:
encoder_cells = [DCGRUCell(rnn_units, adj_mx, max_diffusion_step=max_diffusion_step, num_nodes=num_nodes, filter_type=filter_type) for _ in range(num_rnn_layers)]
https://github.com/liyaguang/DCRNN/blob/master/model/dcrnn_model.py#L48
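Below is a minimal sketch contrasting the two construction patterns. The import path, hyperparameter values, and toy adjacency matrix are assumptions for illustration only; the DCGRUCell arguments mirror the line quoted above, and the shared-weight behavior is as described in the answer.

```python
import numpy as np
import tensorflow as tf  # TF 1.x

from model.dcrnn_cell import DCGRUCell  # import path assumed from the repo layout

# Illustrative hyperparameters (not the values used in the paper/configs).
num_nodes = 10
rnn_units = 64
num_rnn_layers = 2
max_diffusion_step = 2
filter_type = 'dual_random_walk'
adj_mx = np.eye(num_nodes, dtype=np.float32)  # toy adjacency matrix

# Shared weights (the behavior described above): the same cell object is
# repeated, so the layers share the parameters of one DCGRUCell instance.
cell = DCGRUCell(rnn_units, adj_mx, max_diffusion_step=max_diffusion_step,
                 num_nodes=num_nodes, filter_type=filter_type)
shared_stack = tf.contrib.rnn.MultiRNNCell([cell] * num_rnn_layers,
                                           state_is_tuple=True)

# Independent weights: one DCGRUCell is constructed per layer, so each layer
# gets its own set of parameters.
encoder_cells = [DCGRUCell(rnn_units, adj_mx, max_diffusion_step=max_diffusion_step,
                           num_nodes=num_nodes, filter_type=filter_type)
                 for _ in range(num_rnn_layers)]
independent_stack = tf.contrib.rnn.MultiRNNCell(encoder_cells,
                                                state_is_tuple=True)
```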
Multiplying a 'cell' object (i.e. building the stack with [cell] * num_rnn_layers) leads to num_rnn_layers cells, but they have the same weights. Is this the expected behavior?
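For anyone who wants to confirm which of the two behaviors a given build produces, counting the trainable variables after the graph is constructed is a quick check. A minimal sketch, assuming TF 1.x graph mode and that the model graph has already been built:

```python
import numpy as np
import tensorflow as tf  # TF 1.x, graph mode

# Run after the model graph has been built. If the layers share weights, the
# variable and parameter counts do not grow with the number of RNN layers;
# with one cell per layer they grow roughly linearly with num_rnn_layers.
total_params = 0
for var in tf.trainable_variables():
    n = int(np.prod(var.shape.as_list()))
    total_params += n
    print(var.name, var.shape, n)
print('trainable variables: %d, total parameters: %d'
      % (len(tf.trainable_variables()), total_params))
```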