FunkMonkey opened this issue 8 years ago
+1, there are several changes I want to push soon. I could try to squeeze this in, but I don't know when that's going to happen, since lately I don't have much time to spend on synaptic. But if you feel like doing this change yourself, feel free to send a PR!
@cazala Good. I can do it, but we should probably first decide how we want to implement it.
Here are the most important occurrences of `Math.random`:

- `Trainer`
- the `Neuron` constructor (for the bias)
- `Neuron.connection` (for the weight)
- the `Layer` constructor
- the architects (`Perceptron`, etc.)
- `Network.fromJSON`

Considering that the bias is set in the `Neuron` constructor, we could do the following: accept an options object like `{ rng: ... }` there.
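To make the idea concrete, here is a rough sketch of what an injected generator could look like in the `Neuron` constructor. This is not synaptic's current API: the `rng` option name, the fallback to `Math.random`, and the bias range are assumptions for illustration.

```js
// Sketch only: Neuron does not currently take an options object.
// `rng` must have the same contract as Math.random (no args, float in [0, 1)).
function Neuron(options) {
  options = options || {};
  var rng = options.rng || Math.random;

  // Illustrative bias initialization; the exact range should match
  // whatever the existing constructor uses.
  this.bias = rng() * 0.2 - 0.1;
  // ... rest of the constructor unchanged
}
```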
Especially for `Layer` and the architects, the question is: should we add these options as a final parameter (backwards-compatible), or do we give them a new signature by merging the currently existing options and the proposed one into a single object (breaking change)? E.g. for `Layer`:
```js
function Layer(options) {
  const { size, label, rng } = options;
  // ...
}
```
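For comparison, a minimal sketch of the backwards-compatible variant, where the generator is appended as an extra optional parameter. I'm assuming the current constructor is called as `new Layer(size)` / `new Layer(size, label)`; the parameter names are taken from the destructuring above.

```js
// Existing calls keep working; callers who need determinism pass the extra argument.
function Layer(size, label, rng) {
  rng = rng || Math.random;
  // ...
}

// vs. the breaking-change variant from the snippet above:
// new Layer({ size: 4, label: 'hidden', rng: myRng });
```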
We can use https://github.com/davidbau/seedrandom for that for now; it can also be used as a third-party dependency. Shall we move this to the v2 expectations?
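For reference, seedrandom already produces a function with the same contract as `Math.random` (no arguments, returns a float in [0, 1)), so it could be handed straight in as the proposed generator; the `rng` option shown in the comment is still hypothetical.

```js
var seedrandom = require('seedrandom');

// Two generators built from the same seed yield the same sequence,
// which is what makes a training run reproducible.
var rng = seedrandom('my-experiment-seed');
console.log(rng(), rng()); // deterministic values in [0, 1)

// Hypothetical usage once such an option exists:
// var layer = new Layer({ size: 4, rng: seedrandom('my-experiment-seed') });
```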
We are currently in a phase of fine-tuning our network parameters for training, which is unfortunately hard to do when the output of the neural network is to a certain degree random and not deterministic.

Thus I propose adding the possibility to pass an optional random number generator that can be used instead of `Math.random` for initializing biases and weights and for shuffling. People could then use pseudo-random generators like the Mersenne Twister. What do you think?
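Any generator that matches `Math.random`'s contract (no arguments, returns a float in [0, 1)) would fit. As a self-contained illustration, here is mulberry32, a tiny seedable PRNG; it is not the Mersenne Twister, just a much smaller stand-in for the same idea.

```js
// mulberry32: seedable PRNG with the same call signature as Math.random.
function mulberry32(seed) {
  return function () {
    var t = (seed += 0x6D2B79F5);
    t = Math.imul(t ^ (t >>> 15), t | 1);
    t ^= t + Math.imul(t ^ (t >>> 7), t | 61);
    return ((t ^ (t >>> 14)) >>> 0) / 4294967296;
  };
}

var rng = mulberry32(42);
console.log(rng(), rng()); // the same two numbers on every run
```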