Closed AmitMY closed 7 years ago
Theoretically, yes. The universal approximation theorem shows that a shallow (single hidden layer) NN with enough neurons and a nonlinear activation function can approximate any continuous function to arbitrary precision. However, in practice NNs are not very good at multiplication, as they have no operation of that kind built in.
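To make the "theoretically, yes" concrete, here is a minimal hand-rolled sketch (plain JavaScript, not Synaptic's API) of a single-hidden-layer tanh network fitted by stochastic gradient descent to f(x) = x*x on [0, 1]. All names and hyperparameters here are illustrative choices, not anything from the library:

```javascript
// A minimal sketch of the universal approximation idea: one hidden
// layer of tanh units, trained by plain SGD to approximate x^2 on [0,1].

function makeNet(hidden) {
  const rand = () => Math.random() * 2 - 1;
  return {
    w1: Array.from({length: hidden}, rand), // input -> hidden weights
    b1: Array.from({length: hidden}, rand), // hidden biases
    w2: Array.from({length: hidden}, rand), // hidden -> output weights
    b2: rand(),                             // output bias
  };
}

function forward(net, x) {
  const h = net.w1.map((w, i) => Math.tanh(w * x + net.b1[i]));
  const y = h.reduce((s, hi, i) => s + net.w2[i] * hi, net.b2);
  return {h, y};
}

function trainStep(net, x, target, rate) {
  const {h, y} = forward(net, x);
  const dy = y - target; // gradient of (1/2)(y - target)^2 w.r.t. y
  net.b2 -= rate * dy;
  for (let i = 0; i < h.length; i++) {
    const dh = dy * net.w2[i] * (1 - h[i] * h[i]); // tanh derivative
    net.w2[i] -= rate * dy * h[i];
    net.w1[i] -= rate * dh * x;
    net.b1[i] -= rate * dh;
  }
}

const net = makeNet(8);
const xs = Array.from({length: 41}, (_, i) => i / 40);
for (let epoch = 0; epoch < 4000; epoch++) {
  for (const x of xs) trainStep(net, x, x * x, 0.05);
}

let maxErr = 0;
for (const x of xs) {
  maxErr = Math.max(maxErr, Math.abs(forward(net, x).y - x * x));
}
console.log('max |net(x) - x^2| on [0,1]:', maxErr.toFixed(4));
```

Note the approximation only holds on the trained range; outside [0, 1] the tanh units saturate and the fit to x^2 falls apart, which is one practical face of "NNs are not very good at multiplication".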
Synaptic does not support custom layers for now, so this will not work in Synaptic.
Thanks! I was not aware of that theorem, so I appreciate the lesson. I hope v2.0 will have custom functions in it :)
Custom functions are a complex thing; let's see. They will not go into the initial release, but if other people would like to see them, we will think about how to make them work.
Thanks @Jabher, I think this opens the door to real growth (even working with strings, which is not my goal, just showing there are no limits).
@AmitMY I'm not sure that you really need to work with strings or anything like that inside the NN itself; in my opinion that should live in another part of the flow.
I was trying to train a network to compute the Pythagorean theorem. Layers: 2 -> 2 -> 1 / 2 -> 2 -> 1 -> 1
The requirement is that the network's neuron transfer function be able to represent f(n) = n*n or f(n) = sqrt(n), so I was wondering whether this ANN can learn these functions.
I also did not see any way to print the final neuron function. Is there one? In this case, it should print:
f(n, m) = n^2 + 0*m
f(n, m) = 0*n + m^2
f(n, m) = n + m
f(n) = n^0.5
I understand that not all ANN implementations support powers, so I was wondering whether this one does.
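For what it's worth, the mapping itself can be *approximated* on a bounded range even without square/sqrt transfer functions. Below is a hedged sketch in plain JavaScript (not Synaptic's API; all names and hyperparameters are my own illustrative choices) of a 2 -> 12 -> 1 tanh net trained by SGD on sqrt(n^2 + m^2) over [0, 1]^2. It also shows what "printing the final neuron function" would actually yield: a formula in the learned weights, never a symbolic n^2 + m^2:

```javascript
// Sketch: train a 2-input, one-hidden-layer tanh net by SGD to
// approximate sqrt(n^2 + m^2) on the unit square, then print the
// learned function as a closed-form expression in its weights.

const H = 12, rand = () => Math.random() * 2 - 1;
const w1 = Array.from({length: H}, () => [rand(), rand()]); // input -> hidden
const b1 = Array.from({length: H}, rand);                   // hidden biases
const w2 = Array.from({length: H}, rand);                   // hidden -> output
let b2 = rand();                                            // output bias

function forward(n, m) {
  const h = w1.map((w, i) => Math.tanh(w[0] * n + w[1] * m + b1[i]));
  return {h, y: h.reduce((s, hi, i) => s + w2[i] * hi, b2)};
}

function step(n, m, target, rate) {
  const {h, y} = forward(n, m);
  const dy = y - target;
  b2 -= rate * dy;
  for (let i = 0; i < H; i++) {
    const dh = dy * w2[i] * (1 - h[i] * h[i]); // tanh derivative
    w2[i] -= rate * dy * h[i];
    w1[i][0] -= rate * dh * n;
    w1[i][1] -= rate * dh * m;
    b1[i] -= rate * dh;
  }
}

// Train on an 11 x 11 grid of points in [0,1]^2.
const grid = [];
for (let i = 0; i <= 10; i++)
  for (let j = 0; j <= 10; j++) grid.push([i / 10, j / 10]);
for (let epoch = 0; epoch < 4000; epoch++)
  for (const [n, m] of grid) step(n, m, Math.hypot(n, m), 0.03);

let maxErr = 0;
for (const [n, m] of grid)
  maxErr = Math.max(maxErr, Math.abs(forward(n, m).y - Math.hypot(n, m)));
console.log('max error on the training grid:', maxErr.toFixed(4));

// "Printing" the learned function: a sum of tanh terms, not n^2 + m^2.
const formula = 'f(n, m) = ' + b2.toFixed(3) + ' + ' + w2.map((w, i) =>
  w.toFixed(3) + '*tanh(' + w1[i][0].toFixed(3) + '*n + ' +
  w1[i][1].toFixed(3) + '*m + ' + b1[i].toFixed(3) + ')'
).join(' + ');
console.log(formula);
```

So the honest answer matches the comments above: the net can get numerically close on the range it was trained on, but it cannot recover the exact expressions f(n, m) = n^2 + 0*m etc., and extrapolation outside [0, 1] degrades quickly.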