Closed eeegnu closed 6 years ago
Hi,
Thanks for pointing out the mistakes in the figures; I think I fixed them.
I didn't understand what you meant by the last part:
> And while the example in the doc says LW 3,2 has 4 weights, there are only 2 in the corresponding example image (2 neurons in the (M-1)th hidden layer connect to a single neuron in the output layer). Is this to say that it is still stored in a 2x2 matrix, but two of those weight variables are actually zeros?
Best regards
Dennis
From: eeegnu notifications@github.com Sent: Friday, 15 June 2018, 03:30 To: yabata/pyrenn pyrenn@noreply.github.com Subject: [yabata/pyrenn] Errors in documentation (#7)
Was reading through the documentation and there were a few things I believe to be mistakes.
In this image the middle weight going from layer 3 to layer 2 should be LW 2,3 [d], rather than LW 1,2 [d]. https://raw.githubusercontent.com/yabata/pyrenn/master/doc/img/recurrent_nn.png
Here layer 1 is missing its bias values https://github.com/yabata/pyrenn/blob/master/doc/img/MLP2221_detailed.png
And while the example in the doc says LW 3,2 has 4 weights, there are only 2 in the corresponding example image (2 neurons in the (M-1)th hidden layer connect to a single neuron in the output layer). Is this to say that it is still stored in a 2x2 matrix, but two of those weight variables are actually zeros?
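A quick sketch may make the shape question concrete. This is a hypothetical illustration (not pyrenn's actual internals), assuming the usual convention that the layer weight matrix LW^m,m-1 has one row per neuron in layer m and one column per neuron in layer m-1, for a 2-2-2-1 network as in the example image:

```python
# Hypothetical sketch of layer-weight-matrix shapes in a 2-2-2-1 MLP.
# Convention assumed: LW^{m,m-1} has shape
# (neurons in layer m) x (neurons in layer m-1).
layer_sizes = [2, 2, 2, 1]  # input, hidden 1, hidden 2 (layer M-1), output (layer M)

def lw_shape(m):
    """Rows = neurons in layer m, columns = neurons in layer m-1."""
    return (layer_sizes[m], layer_sizes[m - 1])

for m in range(1, len(layer_sizes)):
    rows, cols = lw_shape(m)
    print(f"LW^{m},{m-1}: {rows}x{cols} -> {rows * cols} weights")
```

Under that convention LW^3,2 comes out as 1x2 (2 weights), matching the image, rather than 2x2 with zero entries.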
Hi, I've put it all into one picture since it is a bit confusing referencing it otherwise.
This was a mistake. Thanks!
Hi,
Just one more comment: I think you meant to convert the 2x2 matrix LW^3,2 into a row rather than a column in the updated docs, since there is no w^3_2,1.
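Eugene's point can be sketched as follows, assuming the notation w^m_j,i for the weight from neuron i in layer m-1 to neuron j in layer m (the example values below are arbitrary, chosen only for illustration):

```python
# With a single output neuron (j = 1 only), LW^{3,2} is a 1x2 row:
# [w^3_{1,1}, w^3_{1,2}]. A 2x1 column would require a w^3_{2,1},
# i.e. a second output neuron, which the example network does not have.
w3_11, w3_12 = 0.5, -0.25  # arbitrary example weights

LW_3_2 = [[w3_11, w3_12]]  # 1 row, 2 columns

n_rows, n_cols = len(LW_3_2), len(LW_3_2[0])
print(f"LW^3,2 is {n_rows}x{n_cols}")  # a row, not a column
```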
Best, Eugene
👍