
Building-Neural-Networks-From-Scratch


Building Neural Networks From Scratch book repository.

You can start reading the book at https://jc-progjava.github.io/Building-Neural-Networks-From-Scratch/.

A license for the code and book is provided at https://jc-progjava.github.io/Building-Neural-Networks-From-Scratch/preface.html#license.

Code can be found in this repository, organized by chapter name and number.

Do provide feedback or tips & thanks through their respective forms! 🙂

Feedback: https://forms.gle/P68m2YurzxBecRLv6

Tips & Thanks: https://forms.gle/kEAdNtnJquAsuSGj7

Paperback/Soft Offline Copy

For readers who want to access this book offline or as a soft copy (PDF), download the book here.


Changelog

13 March 2022

Configuration: 784-32-10

| Optimizer | Loss | Activations | Learning Rate | Testing Accuracy (%) |
| --- | --- | --- | --- | --- |
| Adam | CCE | LEAKYRELU-SOFTMAX | 0.001 | 96.41 |
| None | CCE | LEAKYRELU-SOFTMAX | 0.01 | 95.58 |
| Momentum | CCE | SIGMOID-SIGMOID | 0.01 | 95.18 |
| Adam | CCE | SIGMOID-SIGMOID | 0.001 | 94.60 |
| Momentum | MSE | SIGMOID-SIGMOID | 0.01 | 94.19 |
| Adam | MSE | SIGMOID-SIGMOID | 0.001 | 93.82 |
| AdaGrad | MSE | SIGMOID-SIGMOID | 0.01 | 93.30 |
| None | CCE | SIGMOID-SIGMOID | 0.01 | 92.21 |
| RMSProp | CCE | LEAKYRELU-SOFTMAX | 0.001 | 91.40 |
| Momentum | CCE | LEAKYRELU-SOFTMAX | 0.01 | 90.96 |
| AdaGrad | CCE | SIGMOID-SIGMOID | 0.01 | 90.91 |
| None | MSE | SIGMOID-SIGMOID | 0.01 | 88.01 |
| RMSProp | MSE | SIGMOID-SIGMOID | 0.001 | 36.67 (beginning to converge) |
| RMSProp | CCE | SIGMOID-SIGMOID | 0.001 | 30.45 (beginning to converge) |
| AdaDelta | CCE | SIGMOID-SIGMOID | 1 | 10.10 (Error) |
| AdaDelta | CCE | LEAKYRELU-SOFTMAX | 1 | Error |
| AdaDelta | MSE | SIGMOID-SIGMOID | 1 | Error |
| AdaGrad | CCE | LEAKYRELU-SOFTMAX | 0.01 | Error |
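The best 784-32-10 run above pairs a leaky-ReLU hidden layer with a softmax output. As an illustrative sketch only (not the repository's actual code; the 0.01 negative slope is an assumed default), the two activations can be written as:

```java
public class Activations {
    // Leaky ReLU: identity for positive inputs, a small slope (0.01 assumed) for negatives.
    static double leakyRelu(double x) {
        return x > 0 ? x : 0.01 * x;
    }

    // Softmax: exponentiate and normalize so the outputs sum to 1.
    // Subtracting the max logit first avoids overflow for large inputs.
    static double[] softmax(double[] z) {
        double max = Double.NEGATIVE_INFINITY;
        for (double v : z) max = Math.max(max, v);
        double[] out = new double[z.length];
        double sum = 0;
        for (int i = 0; i < z.length; i++) {
            out[i] = Math.exp(z[i] - max);
            sum += out[i];
        }
        for (int i = 0; i < out.length; i++) out[i] /= sum;
        return out;
    }

    public static void main(String[] args) {
        System.out.println(leakyRelu(-2.0));               // small negative value
        double[] p = softmax(new double[]{1.0, 2.0, 3.0}); // probabilities summing to 1
        for (double v : p) System.out.println(v);
    }
}
```

Softmax outputs are what the CCE (categorical cross-entropy) rows consume: each of the 10 output units becomes a class probability for one MNIST digit.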
Configuration: 784-128-10

| Optimizer | Loss | Activations | Learning Rate | Testing Accuracy (%) |
| --- | --- | --- | --- | --- |
| Adam | CCE | LEAKYRELU-SOFTMAX | 0.001 | 97.40 |
| None | CCE | LEAKYRELU-SOFTMAX | 0.01 | 97.31 |
| Adam | CCE | SIGMOID-SIGMOID | 0.001 | 97.01 |
| Adam | MSE | SIGMOID-SIGMOID | 0.001 | 96.94 |
| None | CCE | SIGMOID-SIGMOID | 0.01 | 96.70 |
| RMSProp | CCE | SIGMOID-SIGMOID | 0.001 | 95.04 |
| RMSProp | MSE | SIGMOID-SIGMOID | 0.001 | 94.62 |
| Momentum | CCE | SIGMOID-SIGMOID | 0.01 | 94.56 |
| None | MSE | SIGMOID-SIGMOID | 0.01 | 93.61 |
| Momentum | CCE | LEAKYRELU-SOFTMAX | 0.01 | 92.75 |
| AdaGrad | CCE | SIGMOID-SIGMOID | 0.01 | 86.05 |
| Momentum | MSE | SIGMOID-SIGMOID | 0.01 | 28.56 (beginning to converge) |
| AdaGrad | MSE | SIGMOID-SIGMOID | 0.01 | 21.49 (beginning to converge) |
| AdaDelta | MSE | SIGMOID-SIGMOID | 1 | 10.32 (Error) |
| RMSProp | CCE | LEAKYRELU-SOFTMAX | 0.001 | Error |
| AdaDelta | CCE | SIGMOID-SIGMOID | 1 | Error |
| AdaDelta | CCE | LEAKYRELU-SOFTMAX | 1 | Error |
| AdaGrad | CCE | LEAKYRELU-SOFTMAX | 0.01 | Error |
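Adam at learning rate 0.001 tops both configurations. A minimal single-parameter sketch of the standard Adam update, assuming the usual defaults β1 = 0.9, β2 = 0.999, ε = 1e-8 (not necessarily the book's exact implementation):

```java
public class AdamDemo {
    // Per-parameter Adam state: first and second moment estimates and a step counter.
    double m = 0, v = 0;
    int t = 0;
    final double beta1 = 0.9, beta2 = 0.999, eps = 1e-8, lr = 0.001;

    // Returns the amount to subtract from the parameter given gradient g.
    double step(double g) {
        t++;
        m = beta1 * m + (1 - beta1) * g;            // biased first moment (mean of gradients)
        v = beta2 * v + (1 - beta2) * g * g;        // biased second moment (mean of squares)
        double mHat = m / (1 - Math.pow(beta1, t)); // bias correction for early steps
        double vHat = v / (1 - Math.pow(beta2, t));
        return lr * mHat / (Math.sqrt(vHat) + eps);
    }

    public static void main(String[] args) {
        // Minimize f(x) = x^2 (gradient 2x) starting from x = 1.
        AdamDemo adam = new AdamDemo();
        double x = 1.0;
        for (int i = 0; i < 5000; i++) x -= adam.step(2 * x);
        System.out.println(x); // settles near the minimum at 0
    }
}
```

Because the update divides by the root of the second-moment estimate, each step has magnitude of roughly `lr` regardless of the raw gradient scale, which is one reason Adam tolerates the smaller 0.001 learning rate that the tables show.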

Info:

Observations:

Take a look at the training process:

https://user-images.githubusercontent.com/61588096/158043895-428a6b22-aa19-4c77-9725-a746feb49907.mp4

12 March 2022

4 November 2021

https://user-images.githubusercontent.com/61588096/140303928-ef131a9d-8871-4dbe-b894-83fd28c84f45.mov

18 August 2021

17 August 2021

16 August 2021

15 August 2021

12 August 2021

11 August 2021

10 August 2021

https://user-images.githubusercontent.com/61588096/128802272-0052b94a-cf16-4a7f-9ce7-17b50c6a24c5.mov

26 July 2021

24 July 2021

19 July 2021

9 May 2021

1 January 2021

```
==========     ===========    ==========     //
          ||  ||         ||             ||   //
          ||  ||         ||             ||   //
          ||  ||         ||             ||   //
 ==========   ||         ||   ==========     //
||            ||         ||  ||              //
||            ||         ||  ||              //
||            ||         ||  ||              //
 ==========   ===========     ==========     //
```

29 December 2020

20-26 December 2020

19 December 2020

7 December 2020

18 November 2020

13 November 2020

12 November 2020

8 November 2020

6 November 2020

31 October 2020

29 October 2020

27 October 2020

25 October 2020