Epistimio / orion

Asynchronous Distributed Hyperparameter Optimization.
https://orion.readthedocs.io

Perturbing Weights and Neural Firing Correlation #110

Closed dendisuhubdy closed 6 years ago

dendisuhubdy commented 6 years ago

So last night I had an epiphany. I was thinking about how the brain works: so far there is no known global loss function that the brain minimizes, and if there is anything of the sort, there are only local losses tied to each synaptic response.

So I thought of something else, not evolutionary either. There is a framing in which a neuron fires given an input and produces an output y. Given another input it fires and produces y_1, which is the wrong firing output; it should have been y_1'. The difference \delta = y_1' - y_1 would then be adjusted for in the next iteration.
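A minimal sketch of that local adjustment idea, assuming a single linear neuron with a scalar output (the neuron model, learning rate, and target are illustrative, not anything from Oríon):

```python
import numpy as np

# Hypothetical single neuron: y = w . x, updated by a purely local rule,
# with no global loss or backpropagation through other layers.
rng = np.random.default_rng(0)
w = rng.normal(size=3)
x = np.array([0.5, -1.0, 2.0])
y_target = 1.0  # the "correct" firing output y_1'

for _ in range(50):
    y = w @ x             # actual firing output y_1
    delta = y_target - y  # local error y_1' - y_1
    w += 0.1 * delta * x  # adjust weights toward the target next iteration

final_error = abs(y_target - w @ x)
```

For a single linear unit this is just the classic delta rule, so the firing output converges to the target; the open question in the thread is what replaces it when there is no local target available.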

So there are only two ways this can happen: either the weights are perturbed so that local adjustments are made to the outputs (the firing), or a sudden highway of information opens between two previously unconnected neurons. See http://neuronaldynamics.epfl.ch/online/Ch19.S1.html

In this case, I thought of scrapping backpropagation entirely and trying a new approach: why don't we treat the weights themselves as hyperparameters and perturb them with Oríon?
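As a rough sketch of what "weights as hyperparameters" could mean, here is gradient-free random perturbation search over a weight vector, keeping a candidate only when the loss improves. The toy quadratic loss, step scale, and dimensions are all made up for illustration; this does not use Oríon's API, it only mimics the kind of search an experiment over weight dimensions might run:

```python
import numpy as np

rng = np.random.default_rng(42)

def loss(w):
    # Toy objective: squared distance to an optimum unknown to the search.
    return float(np.sum((w - np.array([1.0, -2.0, 0.5])) ** 2))

w = np.zeros(3)
best = loss(w)
for _ in range(2000):
    candidate = w + rng.normal(scale=0.1, size=3)  # perturb the weights
    c = loss(candidate)
    if c < best:                                   # greedy acceptance
        w, best = candidate, c
```

The obvious caveat is dimensionality: random perturbation is workable for a handful of weights, but a search space with millions of weight dimensions is exactly where gradient-based methods earn their keep.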

[Mumbling in my own thoughts]

bouthilx commented 6 years ago

I don't understand how you intend to modify the weights to minimize an objective. I'd be happy to discuss this, but the issue tracker should not be used for research discussion; let's move this to Slack.