Setherizor opened this issue 7 years ago
The number of layers is not correlated with the number of outputs. Please read through Neural Networks 101 to get a basic understanding of neural networks.
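To illustrate that point, here is a minimal sketch (plain Python, not the library's actual API) of a 3-layer perceptron with a hypothetical 2-4-3 shape: the output count is fixed by the size of the output layer, no matter how many layers sit in front of it.

```python
import math
import random

random.seed(0)

def layer(n_in, n_out):
    # Random weight matrix and bias vector for one fully connected layer.
    return ([[random.uniform(-1, 1) for _ in range(n_in)] for _ in range(n_out)],
            [random.uniform(-1, 1) for _ in range(n_out)])

def forward(x, layers):
    # Sigmoid activation applied layer by layer.
    for weights, biases in layers:
        x = [1 / (1 + math.exp(-(sum(w * xi for w, xi in zip(row, x)) + b)))
             for row, b in zip(weights, biases)]
    return x

# 2 inputs -> 4 hidden units -> 3 outputs (sizes are illustrative).
net = [layer(2, 4), layer(4, 3)]
out = forward([0.5, 0.9], net)
print(len(out))  # 3: one value per output neuron, regardless of depth
```

Adding more hidden layers changes only the middle of the chain; the final list always has as many entries as the last layer has neurons.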
The difference in activation time comes from the network optimizing itself on its first activation, which makes subsequent activations faster.
But in my application, this is not the case. The first 50 or so activations work flawlessly, then about 7-17 activations take over 70 ms.
If you are training the network at the same time, you might see some activations taking longer than usual, which is completely normal.
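One way to check whether the slowdowns really cluster the way described above is to time each activation individually. This is a hedged sketch using the standard library's `time.perf_counter`; the `activate` function here is a stand-in for the network's activation call, not the library's real API.

```python
import time

def activate(x):
    # Placeholder workload standing in for a real network activation.
    return [xi * 0.5 for xi in x]

timings = []
for _ in range(100):
    t0 = time.perf_counter()
    activate([0.1, 0.2])
    timings.append((time.perf_counter() - t0) * 1000)  # milliseconds

# Outliers such as the reported 70+ ms activations would show up here;
# comparing their positions against training steps or GC pauses can
# reveal whether something else is contending for the event loop / CPU.
slow = [t for t in timings if t > 70]
print(len(timings), "samples,", len(slow), "over 70 ms")
```

Logging timestamps alongside each slow sample makes it easier to correlate the spikes with concurrent training calls.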
My network is a 3-layer perceptron. Why does activating it produce 4 outputs when there are only supposed to be three?
Another issue: why does activating the same network sometimes take almost no time, and other times upwards of 80 ms?