karlarnejo opened 5 years ago
I know Encog does not return identical outputs in all cases, so this is something specific to this case; I am marking it as support. Usually, when a neural network outputs the same result for every input, the data (or at least the way it is represented) is not preprocessed well enough for any patterns to be found, so the network ends up roughly averaging the targets.
There seems to be no problem when training my network: it converges and the error falls below 0.01. However, when I load my trained network and feed it the evaluation set, it outputs the same result for every evaluation row (during actual prediction, not the training phase). I trained the network with resilient propagation, using 9 inputs, 1 hidden layer with 7 hidden neurons, and 1 output neuron. UPDATE: My data is normalized using min-max scaling. I am trying to predict electric load data.
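One common cause of identical predictions with min-max scaling is that the evaluation rows are normalized with min/max values recomputed on the evaluation set itself (or not normalized at all), instead of reusing the bounds that were computed on the training data. Below is a minimal plain-Java sketch of that idea, assuming nothing about the original code; the class and method names (`MinMax`, `normalize`, `denormalize`) are hypothetical, not Encog APIs.

```java
// Minimal sketch (not the original code): min-max normalization that reuses
// the bounds computed on the TRAINING data when scaling evaluation rows.
public class MinMax {
    private final double min;
    private final double max;

    // Compute the bounds once, from the training values only.
    public MinMax(double[] trainingValues) {
        double lo = Double.POSITIVE_INFINITY, hi = Double.NEGATIVE_INFINITY;
        for (double v : trainingValues) {
            lo = Math.min(lo, v);
            hi = Math.max(hi, v);
        }
        this.min = lo;
        this.max = hi;
    }

    // Scale a raw value into [0, 1] using the training bounds.
    public double normalize(double v) {
        return (v - min) / (max - min);
    }

    // Map a network output in [0, 1] back to the original load units.
    public double denormalize(double v) {
        return v * (max - min) + min;
    }

    public static void main(String[] args) {
        MinMax scaler = new MinMax(new double[] {100.0, 150.0, 200.0});
        // Evaluation rows must be scaled with these SAME bounds (100..200),
        // not with min/max recomputed on the evaluation set itself.
        System.out.println(scaler.normalize(150.0));   // 0.5
        System.out.println(scaler.denormalize(0.25));  // 125.0
    }
}
```

If the saved network is reloaded in a separate program, the training min/max must be persisted alongside it, since the network itself only ever sees the scaled values.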
Here is the sample data; in each row, the first 9 values are the inputs and the 10th is the ideal value:
Here's the full code:
}
Here is my result:
Results I'm expecting (I replaced the "predicted" column with random values for demonstration purposes, to show that the network should actually be producing varying predictions):