Closed: Nick7hill closed this issue 7 years ago.

Nick7hill: I think you have used 2 bidirectional GRUs, instead of the 3 mentioned in the paper, to encode the questions and passages.
minsangkim142: Could you please point me to the lines of code related to this issue? If you look at https://github.com/minsangkim142/R-net/blob/2d9d27431ed813696af947dc87b8c93f50a71356/model.py#L88-L99, you can see that I use Params.num_layers to control the number of layers, and Params.num_layers is 3 by default:
```python
num_layers = 3  # Number of layers in the question-passage matching and self-matching networks
```
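For readers skimming the thread, here is a minimal, hypothetical sketch of this pattern. It uses modern tf.keras APIs rather than the TF1-style cells the repo actually uses, and the function name `stacked_bigru_encoder`, the layer names, and `units=75` are my own assumptions; it only illustrates how a single `num_layers` parameter can control the depth of a stacked bidirectional-GRU encoder:

```python
import tensorflow as tf

def stacked_bigru_encoder(num_layers=3, units=75):
    """Build `num_layers` stacked bidirectional GRU layers.

    Illustrative sketch only, not the repository's actual code:
    shows how one `num_layers` value can drive encoder depth.
    """
    layers = [
        tf.keras.layers.Bidirectional(
            tf.keras.layers.GRU(units, return_sequences=True),
            name=f"bigru_{i}",  # hypothetical layer name
        )
        for i in range(num_layers)
    ]

    def encode(inputs):
        outputs = inputs
        for layer in layers:
            # Each layer outputs (batch, time, 2 * units):
            # forward and backward states are concatenated.
            outputs = layer(outputs)
        return outputs

    return encode

# Example: encode a batch of 4 passages, 30 timesteps, 100-dim embeddings.
encoder = stacked_bigru_encoder(num_layers=3, units=75)
passage = tf.random.normal([4, 30, 100])
encoded = encoder(passage)
print(encoded.shape)  # (4, 30, 150)
```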
Nick7hill: Alright, my bad, thanks for the explanation!