The `use_bias` option in `trainr` triggers the use of a bias term in the network computation.
During the forward pass ("network crossing") I proceed step by step: weight multiplication, optionally bias addition, then unit activation (the last one being the sigmoid/logistic function).
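The per-unit step can be sketched like this (a minimal illustration with hypothetical names; `forward_step`, `weights`, and `bias` are not the package's actual internals):

```r
# Minimal sketch of one forward step: weights, optional bias, then activation.
# Names are hypothetical, not rnn's internals.
sigmoid <- function(x) 1 / (1 + exp(-x))

forward_step <- function(input, weights, bias = NULL) {
  z <- input %*% weights          # weight multiplication
  if (!is.null(bias)) {
    z <- sweep(z, 2, bias, `+`)   # bias addition, only when a bias is supplied
  }
  sigmoid(z)                      # unit activation (logistic)
}

x <- matrix(c(0.5, -0.2), nrow = 1)  # one sample, two inputs
W <- matrix(0.1, nrow = 2, ncol = 3) # two inputs -> three units
b <- rep(0.5, 3)
forward_step(x, W, b)
```

Passing `bias = NULL` reproduces the `use_bias = FALSE` behaviour with the same code path.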
If `use_bias == TRUE`, a new object, `bias_synapse`, is added to the output list.
In `predictr`, if `bias_synapse` is present in the list, the bias is added during the forward pass.
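In list form, that presence check could look roughly like this (again hypothetical names; `predictr`'s real code differs):

```r
sigmoid <- function(x) 1 / (1 + exp(-x))

# Sketch of a trained model list: bias_synapse present only when use_bias = TRUE
model_with_bias    <- list(synapse = matrix(0.1, 2, 1), bias_synapse = 0.5)
model_without_bias <- list(synapse = matrix(0.1, 2, 1))

predict_sketch <- function(model, input) {
  z <- input %*% model$synapse
  if (!is.null(model$bias_synapse)) {  # bias applied only if it is in the list
    z <- z + model$bias_synapse
  }
  sigmoid(z)
}

x <- matrix(c(1, 1), nrow = 1)
predict_sketch(model_with_bias, x)     # bias applied
predict_sketch(model_without_bias, x)  # bias skipped
```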
I didn't optimize for efficiency: the biases are computed in every case, and I simply apply them or not during the forward pass. I don't think there is a bottleneck here.
The RNG state changed because of bias generation, so I added a seed in test_rnn.R and updated the expected results (I needed that to check that `use_bias = FALSE` behaves the same as the previous version).
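Why the expected test values had to change: initializing the biases consumes extra RNG draws, which shifts every subsequent random number even when the earlier draws are identical. A self-contained sketch (not the package's actual initialization code):

```r
set.seed(42)
w_old <- runif(4)   # old version: weight init only
h_old <- runif(2)   # next draws used later in training

set.seed(42)
w_new <- runif(4)   # new version: same weight draws...
bias  <- runif(1)   # ...but bias init consumes an extra draw
h_new <- runif(2)   # so all subsequent draws differ

identical(w_old, w_new)  # TRUE: weights match
identical(h_old, h_new)  # FALSE: later draws shifted by the bias draw
```

This is why fixing the seed in test_rnn.R was necessary to compare the two versions meaningfully.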
Coverage decreased (-1.2%) to 87.047% when pulling e7ffc91683c132e4baf562eb6e356f7c506d557e on DimitriF:master into fd06501dd3b830274797d3f0d910bbab4c39df86 on bquast:sigmoid.