cazala / synaptic

architecture-free neural network library for node.js and the browser
http://caza.la/synaptic

A few quick questions! #233

Open masonmahaffey opened 7 years ago

masonmahaffey commented 7 years ago

1. What is the function of the learning rate when training a network/layers?
2. When building a neural network, how do you determine the minimum width of neurons and/or layers needed for the correct output?
3. Do I need to train the network on the correct output for every possible input? i.e. I built a network earlier today and was training it to output 1 only when a specific input was given, however, it was giving the same output no matter what input I gave it. Do I need to train the network to output 0 for every other possible input?

I've been having a blast messing around with this library. Good work to everyone who's worked on it, and I'd love to contribute in the future once I've mastered NNs.

THANKS!!

wagenaartje commented 7 years ago
  1. I'm not completely sure what you meant here, but: by default the learning rate is fixed, so it stays constant over time. You can also pass an array of different learning rates, which will be divided equally over the iterations.
  2. You really can't. This is something that has to be done by trial and error. More info here.
  3. No, you don't. But in your case you only have one training sample, and there is no relationship between different samples. Normally, when you train a neural network you provide a lot of data and train the network on 80% of it. Then you test how well the network performs on the other 20%, which it was never trained on (see the sketch below this list). However, for small datasets there is no guarantee that the network will also work for values that are not in the training samples.
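
Here's a rough, untested sketch of that 80/20 idea using the Trainer options from the README (`rate`, `iterations`, `error`); `buildDataset` and the 2-5-1 layer sizes are just placeholders you'd replace with your own:

```js
// Rough sketch (Node.js). Assumes a dataset of { input: [...], output: [...] } samples.
const { Architect, Trainer } = require('synaptic');

const data = buildDataset(); // hypothetical helper: however you collect your samples

// 80/20 split: train on the first 80%, keep the last 20% for testing.
const cut = Math.floor(data.length * 0.8);
const trainingSet = data.slice(0, cut);
const testSet = data.slice(cut);

// Layer sizes are trial and error; 2-5-1 is just an arbitrary starting point.
const network = new Architect.Perceptron(2, 5, 1);
const trainer = new Trainer(network);

trainer.train(trainingSet, {
  rate: 0.1,         // fixed learning rate; an array like [0.3, 0.1, 0.05] should be
                     // spread equally over the iterations, as described above
  iterations: 20000,
  error: 0.005,
  shuffle: true,
  log: 1000
});

// See how the network does on samples it was never trained on.
testSet.forEach(sample => {
  console.log(sample.input, '->', network.activate(sample.input), 'expected:', sample.output);
});
```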

Awesome to have another enthusiast here!

masonmahaffey commented 7 years ago

1) Ahh, ok, that was bad wording on my part in the first question. What I really meant was: what is the purpose of the learning rate / what is the learning rate?

2) Gotcha, that's what I've been reading elsewhere. I've found some articles detailing general best practices to try initially so you don't end up too far from the ideal architecture while experimenting to find it.

3) Yes, I see now. I never took into account the fact that ultimately NNs are used to recognize similar patterns within different datasets. In my case, I might as well have used a simple if statement lol.

That must have been a record response time for a GitHub repo issue; I'm glad to see you guys care about this project.

This is incredibly helpful, btw.

wagenaartje commented 7 years ago

The learning rate is a variable used during backpropagation, specifically during the weight and bias update. The algorithm calculates whether a weight should become larger or smaller, i.e. a direction for the weight to move (up/down). The learning rate tells the algorithm how far to move in that direction: not too much, so we don't overshoot the target, and not too little, otherwise we won't learn anything.
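
In other words, the core of the update is roughly this (a simplified plain gradient descent sketch, not synaptic's exact internals; the variable names are just for illustration):

```js
// Simplified sketch of a single gradient-descent step (not synaptic's exact code).
// The gradient says which direction the weight should move and how steeply the error
// changes; the learning rate scales how big a step we take in that direction.
function updateWeight(weight, gradient, learningRate) {
  return weight - learningRate * gradient;
}

// Example: with a gradient of 2.0, a rate of 0.1 nudges the weight by -0.2,
// while a rate of 0.5 takes a much larger (possibly overshooting) step of -1.0.
console.log(updateWeight(0.8, 2.0, 0.1)); // 0.6
console.log(updateWeight(0.8, 2.0, 0.5)); // -0.2
```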

"Training parameter that controls the size of weight and bias changes in learning of the training algorithm." (source)

More info here.

No problem :)